
As far as I know, the entropy H(X) of a quantized source can be viewed as the minimum number of bits needed to encode the information. My question is: is entropy an appropriate metric for the distortion of real-valued data in a wireless sensor network (WSN)?
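For concreteness, by entropy here I mean the Shannon entropy of the quantized source, H(X) = -\sum_x p(x) \log_2 p(x) bits per sample, which by the source coding theorem is the minimum average number of bits needed to encode X losslessly.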

For example, there are N temperature sensors deployed in the WSN. Due to spatial correlation, measurements from nearby sensors are highly correlated. Now we assume that the collection of data gathered by all sensors follows a multivariate Gaussian distribution. Since a Gaussian source is continuous, we also assume that each sensor quantizes its measured temperature with a quantization step \delta, so the entropy of the source is finite and can be calculated. Suppose that resources are limited, so we can only choose a subset of sensors S to transmit their data, where |S| = M and M < N.
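To make the setup concrete, here is a minimal sketch of how the entropy of the quantized Gaussian source, and of a selected subset S, could be computed. It uses the fine-quantization approximation H(X_\delta) \approx h(X) - N \log_2 \delta, and it assumes an illustrative exponential spatial-correlation model with unit variance and sensors on a line, which are not specified in the question.

```python
import numpy as np

def gaussian_quantized_entropy(cov, delta):
    """Approximate joint entropy (bits) of a Gaussian vector quantized
    per component with step delta: H(X_delta) ~= h(X) - N*log2(delta),
    where h(X) = 0.5*log2((2*pi*e)^N * det(cov)) is the differential entropy."""
    n = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)  # log-determinant in natural log
    h_diff_bits = 0.5 * (n * np.log(2 * np.pi * np.e) + logdet) / np.log(2)
    return h_diff_bits - n * np.log2(delta)

# Illustrative WSN: N sensors on a line, correlation decaying with distance
# (assumed model, not from the question), unit variance, quantization step 0.1.
N, delta = 10, 0.1
positions = np.arange(N)
cov = np.exp(-0.5 * np.abs(positions[:, None] - positions[None, :]))

H_all = gaussian_quantized_entropy(cov, delta)

# Entropy of a subset S of M sensors (here: every other sensor, M = 5).
S = positions[::2]
H_subset = gaussian_quantized_entropy(cov[np.ix_(S, S)], delta)

print(f"H(all {N} sensors)      ~ {H_all:.2f} bits")
print(f"H(subset, |S| = {len(S)}) ~ {H_subset:.2f} bits")
```

With such a calculation one can compare the entropy of the full sensor set against that of a chosen subset S, which is one way to quantify how much information is lost when only M of the N sensors transmit.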
