Description
Disseminating data globally in near real time is a challenge, and the usual alternative, wholesale data replication, is both cost-intensive and slow. One possible solution is a protocol that classifies data at the source, sorting it by spatial, temporal, and spectral significance as defined there. Parametric thresholds, i.e. the boundary conditions for classification, are part of the architecture and yield propagation-ready warehousing of data at the source. A further optimization propagates only the variations from standard data, or from the most specific conditions, which sizably reduces the volume of data transmitted over the network.

The complete data set, or an applicable subset, can be rebuilt at the remote site by adding the received variations back to the standard conditions, regenerating the data as recorded by the sensors. The protocol can be implemented at the transport layer or higher of the Open Systems Interconnection model. Existing database designs need new fields to accommodate keeping records of only the received variations. A furtherance of this optimization, recording only variations at the sensors even before analogue-to-digital conversion and transmission, could also be investigated, with possibly fascinating outcomes.
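To make the variation-based propagation concrete, here is a minimal sketch in Python of the source-side encoding and remote-side reconstruction described above. It assumes a flat record of named sensor fields; the baseline STANDARD, the thresholds THRESHOLD, and both function names are hypothetical illustrations, not part of any defined protocol.

    # Minimal sketch: propagate only above-threshold variations from a
    # standard baseline, then rebuild the record at the remote site.
    # STANDARD, THRESHOLD, and the function names are illustrative only.

    STANDARD = {"temperature": 20.0, "pressure": 1013.25}   # standard conditions
    THRESHOLD = {"temperature": 0.5, "pressure": 1.0}       # parametric thresholds

    def encode_variations(reading):
        """Source side: keep only fields whose deviation from the
        standard exceeds the parametric threshold for that field."""
        deltas = {}
        for field, value in reading.items():
            deviation = value - STANDARD[field]
            if abs(deviation) > THRESHOLD[field]:
                deltas[field] = deviation
        return deltas  # this, not the full record, crosses the network

    def rebuild_reading(deltas):
        """Remote side: add the received variations back to the standard
        conditions; absent fields fall back to the standard value."""
        return {field: STANDARD[field] + deltas.get(field, 0.0)
                for field in STANDARD}

    reading = {"temperature": 23.1, "pressure": 1013.4}
    deltas = encode_variations(reading)   # only temperature exceeds threshold
    print(deltas)                         # {'temperature': 3.1} (approximately)
    print(rebuild_reading(deltas))        # pressure reverts to 1013.25

Note that under this sketch, sub-threshold deviations (the pressure reading above) are reconstructed as the standard value, so the thresholds set the trade-off between transmitted volume and fidelity, consistent with classifying data by significance at the source.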