The calculated RMS error has to be compared with the nominal value observed. For example, the TCP throughput predicted by a single-user model, or measured with the TamoSoft Throughput Test software, is about 5.8 Mbps when SNR > 30 dB. If the RMS error observed when using the model is 0.5 Mbps, the percentage error is (0.5/5.8) x 100 ≈ 8.6%. The acceptable error therefore depends on the application: some applications can tolerate a high error, while others require more precision. What we are trying to find out is whether a maximum limit has been set as a standard for different applications.
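For concreteness, here is a minimal sketch of the percentage-error calculation described above. The nominal throughput (5.8 Mbps) and RMS error (0.5 Mbps) are just the example figures from this post, not measured values:

```python
def percentage_error(rms_error_mbps: float, nominal_mbps: float) -> float:
    """Express a model's RMS error as a percentage of the nominal throughput."""
    return (rms_error_mbps / nominal_mbps) * 100

nominal = 5.8   # TCP throughput predicted/measured when SNR > 30 dB, in Mbps
rms_err = 0.5   # RMS error of the model, in Mbps

print(f"Percentage error: {percentage_error(rms_err, nominal):.1f}%")  # -> 8.6%
```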
Thanks Joel, that was helpful. It gave me quite some insight, but it considered only the maximum calculated throughput of IEEE 802.11b for voice traffic. What we want to know is whether, for different applications, a maximum allowed RMS error has been fixed for a model to be considered acceptable.