I've decided to repeat my question asked in December 2014. Since then it has been viewed more than 400 times, but only one person decided to touch the topic. The question is not trivial and, in my opinion, deserves a wider discussion.
From a general point of view, every "sensor node (SN)-base station (BS)" pair is nothing more than a remote measurement device, and it is somewhat surprising that MSE, which is universal and commonly used in measurement, is not used in communications, in particular in WSN theory and practice. Moreover, the MSE of a transmission can be measured easily: it is enough to average a few hundred squared differences between the samples of the reference input signal and the signal recovered by the BS. This would give an adequate, accurate, and clear assessment of the quality of transmission.
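The averaging procedure described above can be sketched in a few lines. This is a minimal illustration, not a reference implementation; the signal values and the function name are hypothetical.

```python
# Hypothetical sketch of the MSE measurement described above: average the
# squared sample-by-sample differences between the reference input signal
# and the signal recovered by the base station.

def transmission_mse(reference, recovered):
    """Mean of squared differences between two equal-length sample lists."""
    assert len(reference) == len(recovered)
    return sum((r - x) ** 2 for r, x in zip(reference, recovered)) / len(reference)

# Illustrative samples: a recovered signal with small channel/quantization errors.
reference = [0.0, 1.0, 2.0, 3.0]
recovered = [0.1, 0.9, 2.2, 2.9]
print(transmission_mse(reference, recovered))  # 0.0175
```

In practice the reference would be a known test signal applied at the sensor input, and the average would be taken over several hundred samples, as noted above.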
Nevertheless, the performance of analog signal transmission is currently assessed by bit rate (throughput), power-bandwidth efficiency, and BER.
However, these characteristics describe how completely the channel resources are utilized, not how accurately the signals are recovered. It is impossible to assess MSE from BER, because BER does not show whether the more or the less significant bits of the received code were distorted. The transmitted packets include address and error-control bits added to the groups of information bits of the compressed sequences of original code words formed by the input ADC.
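The point about bit significance can be made concrete with a small sketch. Assuming an 8-bit ADC code word (a hypothetical example, not from the original post), a single bit error, i.e. the same contribution to BER, produces a squared error that differs by four orders of magnitude depending on which bit is flipped:

```python
# Illustrative sketch: one flipped bit per 8-bit word gives the same BER,
# but the resulting squared error depends entirely on the bit position.

def flip_bit(word, pos):
    """Flip bit `pos` of an integer code word (pos 0 = LSB)."""
    return word ^ (1 << pos)

word = 0b10000000  # hypothetical ADC code word, value 128

lsb_err = (flip_bit(word, 0) - word) ** 2  # LSB flip: squared error 1
msb_err = (flip_bit(word, 7) - word) ** 2  # MSB flip: squared error 16384

print(lsb_err, msb_err)  # 1 16384
```

Since BER counts both errors identically, it cannot by itself predict the MSE of the recovered signal.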
Moreover, BER measurements are very complex and time-consuming, and their results depend on the scenario of the SN-BS application (distance, surroundings, etc.). This further weakens the connection between BER and the quality of the SN-BS link. One may add that, apart from channel errors, the analog sensors themselves always add uncertainty to the signals routed to the SN transmitter. So what does BER actually describe?
This permits the conclusion that BER is only a generalized and rough characteristic of the quality of transmission. Do you agree? And why is MSE not used?
Article New Approach to Improvement and Measurement of the Performan...