Please note that there is a tradeoff between energy consumption (battery life) and performance (which affects delay, etc.). The acceptable delay depends on the APPLICATION and its requirements.
Some applications require rates of several samples per second; others require only a few per hour.
The total delay will depend on the delay of each hop in the WSN and on the topology of the WSN.
Try searching for "Probabilistic Estimation of End-to-End Path Latency in Wireless Sensor Networks".
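To make the per-hop point concrete, here is a minimal sketch that sums per-hop delay components along a multi-hop path. All the delay figures and the four-component breakdown (processing, queuing, transmission, propagation) are illustrative assumptions, not measurements from any real deployment:

```python
# Sketch: end-to-end delay as the sum of per-hop delays along a WSN path.
# All numbers are illustrative assumptions, not measurements.

def end_to_end_delay(hops):
    """Each hop contributes processing + queuing + transmission + propagation delay."""
    return sum(h["proc"] + h["queue"] + h["tx"] + h["prop"] for h in hops)

# A hypothetical 4-hop path; all delays in milliseconds.
path = [
    {"proc": 1.0, "queue": 5.0,  "tx": 4.0, "prop": 0.001},
    {"proc": 1.0, "queue": 12.0, "tx": 4.0, "prop": 0.001},  # a congested hop
    {"proc": 1.0, "queue": 5.0,  "tx": 4.0, "prop": 0.001},
    {"proc": 1.0, "queue": 5.0,  "tx": 4.0, "prop": 0.001},
]

print(f"end-to-end delay: {end_to_end_delay(path):.3f} ms")
```

Note how a single congested hop dominates the budget; this is why topology (number of hops, where traffic aggregates) matters as much as the per-hop radio delay.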
I totally agree with George; it really depends on the situation. With mobile sensor nodes, will you rely on proactive routing? Your nodes often change location, so the application may find a route broken. What if you rely on reactive routing instead? That could be suitable: every time your application sends something, the route is built on demand. Again, it is all about trade-offs and how you tune for the situation.
I shall add that the total delay depends on the MAC protocol, the routing protocol, the congestion level (TX queue length), the number of hops, and the channel quality if the MAC protocol supports retransmission in case of packet loss.
At the MAC layer, WSNs usually use duty cycling to reduce energy consumption. Lower duty cycles result in lower energy consumption but higher end-to-end delay and lower throughput, and vice versa.
Consequently, I think you should investigate the application requirements (acceptable delay, packet loss, desired network lifetime) and derive the parameter values for the lower layers that satisfy those requirements.
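The duty-cycling trade-off can be sketched with a toy model. The radio power numbers, the fixed on-time, and the assumption that a sender waits on average half a wake-up period for the next hop to listen are all mine, chosen only to show the shape of the trade-off:

```python
# Toy model of the duty-cycling trade-off: a lower duty cycle saves energy
# but stretches per-hop latency. All constants are illustrative assumptions.

WAKE_ON_TIME = 0.01   # seconds the radio stays on per cycle (assumed)
P_RADIO_ON = 60e-3    # radio power while on, watts (assumed)
P_SLEEP = 3e-6        # sleep power, watts (assumed)

def avg_power(duty_cycle):
    """Average power draw as a weighted mix of on and sleep power."""
    return duty_cycle * P_RADIO_ON + (1 - duty_cycle) * P_SLEEP

def avg_per_hop_latency(duty_cycle):
    """On-time is fixed, so the cycle period grows as 1/duty_cycle;
    a packet waits half a period on average for the receiver to wake."""
    period = WAKE_ON_TIME / duty_cycle
    return period / 2

for dc in (0.5, 0.1, 0.01):
    print(f"duty {dc:>5.0%}: power {avg_power(dc)*1e3:6.2f} mW, "
          f"per-hop wait {avg_per_hop_latency(dc)*1e3:7.1f} ms")
```

Under these assumptions, dropping the duty cycle from 10% to 1% cuts average power by roughly 10x but multiplies the expected per-hop wait by 10x as well, which is exactly the tuning knob the application requirements should drive.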
To add to the answers above: think of the end-to-end delay as it affects your application. The way I tend to think about the network is as a large distributed compute fabric, with sensors injecting data streams into the network, and the network computing on that data. If you have sensor nets that generate a lot of data, you will need to compute in the network so as not to overload the aggregation trunks. If you have trickle networks, such issues would not arise, but your network needs to be able to buffer. And everything in between.
So, take a look at the application that needs to be supported and then do a top-level SDF (synchronous dataflow) analysis to see whether any unbounded queues develop. If so, you need to mitigate those bottlenecks.
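One cheap first pass on the "unbounded queues" check, far simpler than a full SDF analysis: compare each node's aggregate inbound rate against its service rate, since any node where arrivals exceed service grows its queue without bound in steady state. The tree topology and all rates below are my own illustrative assumptions:

```python
# Rough stability check for an aggregation tree: a node's queue is unbounded
# if the total packet rate it must forward exceeds its service rate.
# Topology and rates are illustrative assumptions.

# node -> own sampling rate (pkts/s), children, service rate (pkts/s)
tree = {
    "sink": {"own": 0.0, "children": ["a", "b"], "service": 100.0},
    "a":    {"own": 2.0, "children": ["c", "d"], "service": 10.0},
    "b":    {"own": 2.0, "children": [],         "service": 10.0},
    "c":    {"own": 5.0, "children": [],         "service": 10.0},
    "d":    {"own": 5.0, "children": [],         "service": 10.0},
}

def offered_rate(node):
    """Traffic a node must forward: its own samples plus all descendants'."""
    n = tree[node]
    return n["own"] + sum(offered_rate(c) for c in n["children"])

for name in tree:
    load = offered_rate(name)
    ok = load <= tree[name]["service"]
    print(f"{name}: offered {load:5.1f} pkt/s, service "
          f"{tree[name]['service']:5.1f} -> {'stable' if ok else 'UNBOUNDED QUEUE'}")
```

In this example node "a" must relay 12 pkt/s through a 10 pkt/s link, so its queue is the bottleneck to mitigate, even though the sink itself has plenty of capacity.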
One final comment: end-to-end delay should be a parameter that does NOT end up in your network spec. Two reasons: 1) if you put it there, applications will be written that depend on it, and 2) if you ever expand your network, you will violate that spec at some point.
Think of the Internet as your ultimate sensor network and use its design parameters as your network's logical extensions.
Dear all, thank you very much for the explanation.
Basically I'm using the fundamentals of the LEACH protocol in my heterogeneous architecture development, and I think a delay of around 1/100th of a second is acceptable for two-event sensing applications.
The last point by Theodore Omtzigt is really notable: end-to-end delay should not end up as a parameter in the network spec.
There is another parameter which has to be considered: the time interval the process can tolerate for correct control. Sensors are used for data acquisition, and the data are used for process control. Delays introduce a phase shift which affects loop stability. In fact, many processes where wireless data acquisition is used are slow, so even high (everything is relative) delays can be accepted. It is not necessary to use data acquisition rates higher than required, since this has a negative influence on the time between battery recharges and can create a data bottleneck, increasing the delay for other data transfers.
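The phase-shift point can be quantified: a pure transport delay of tau seconds contributes a phase lag of 360 * f * tau degrees at frequency f, which eats directly into the loop's phase margin at the gain-crossover frequency. A minimal sketch, where the crossover frequency and the 45-degree margin are assumed values for illustration:

```python
# How network delay erodes a control loop's phase margin.
# A pure delay tau adds a phase lag of 360 * f * tau degrees at frequency f.
# The crossover frequency and phase margin below are illustrative assumptions.

def delay_phase_lag_deg(delay_s, freq_hz):
    """Phase lag (degrees) contributed by a pure delay at a given frequency."""
    return 360.0 * freq_hz * delay_s

crossover_hz = 0.5        # assumed gain-crossover frequency of the loop
phase_margin_deg = 45.0   # assumed phase margin without network delay

for delay_ms in (10, 50, 100, 300):
    lag = delay_phase_lag_deg(delay_ms / 1000.0, crossover_hz)
    remaining = phase_margin_deg - lag
    status = "OK" if remaining > 0 else "STABILITY AT RISK"
    print(f"delay {delay_ms:3d} ms: lag {lag:6.1f} deg, "
          f"margin left {remaining:6.1f} deg -> {status}")
```

This also shows why slow processes tolerate WSN delays well: at a low crossover frequency even hundreds of milliseconds of delay cost only a few degrees of margin, while a fast loop would lose its margin entirely.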
Smitha, for the 500 m range, and as Theodore Omtzigt mentioned above, you have to think of the INTERNET as your ultimate sensor NETWORK. That means you can use (and combine) 802.11 as well. You can also use directional antennas (panel antennas, Yagi) if required, e.g. for a point-to-point (P2P) link, for increased bandwidth and increased noise immunity / system reliability.
In a lab, yes, the values are correct, but it also depends on how the WSN is configured, as mentioned above. In many large real-world deployments, latency can grow a lot. That's why I agree with what Theodore Omtzigt says: (if possible) don't tell. Let the system expect much larger delays, to leave plenty of room for whatever the delay turns out to be. For example, how long will the latency be if there are topology changes or an outage? The system should be able to handle such cases. Don't trust the network; it has its faults and needs some 'freedom' to live its life. Low delay requires tighter control over the WSN, that's all.