I have built a system that continuously classifies a device's state by reading time-series data from the sensors connected to it. The goal of this work is to reduce bandwidth and latency by computing on the device itself rather than offloading to the cloud, fog, or edge. Could you advise me on what to plot, and how, to demonstrate that on-device computing is more efficient than cloud, fog, or edge computing in terms of bandwidth and latency? A minimal sketch of what I have in mind so far is included below. Please also share any related resources. Thank you.
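
For context, here is a minimal sketch of the kind of comparison I have in mind (assuming matplotlib; the tier names and all numeric values are placeholders, not real measurements, and would be replaced by data collected from my own setup):

```python
# Sketch: grouped bar charts comparing per-classification latency and uplink
# bandwidth across deployment tiers. All numbers below are placeholders --
# replace them with your own measured values.
import matplotlib.pyplot as plt
import numpy as np

tiers = ["On-device", "Edge", "Fog", "Cloud"]

# Placeholder measurements: mean end-to-end latency per classification (ms)
# and data sent over the network per classification (kB).
latency_ms = [5.0, 20.0, 45.0, 120.0]   # hypothetical values
bandwidth_kb = [0.0, 4.0, 4.0, 4.0]     # on-device sends nothing upstream

x = np.arange(len(tiers))
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.bar(x, latency_ms, color="tab:blue")
ax1.set_xticks(x)
ax1.set_xticklabels(tiers)
ax1.set_ylabel("End-to-end latency (ms)")
ax1.set_title("Latency per classification")

ax2.bar(x, bandwidth_kb, color="tab:orange")
ax2.set_xticks(x)
ax2.set_xticklabels(tiers)
ax2.set_ylabel("Uplink data per classification (kB)")
ax2.set_title("Bandwidth per classification")

fig.tight_layout()
plt.show()
```

Is this kind of per-tier comparison the right direction, or is there a more standard way to present bandwidth and latency trade-offs for on-device versus offloaded inference?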
