Is the explanation simply that signals take longer to propagate through digital equipment? For instance, software synthesis is very slow compared to hardware synthesis.
Well, I would say all real-time signals are analog in nature. Analog equipment can process the raw analog input signal directly. Digital equipment, however, must first convert the analog signal to digital and then process it with a microcontroller or DSP circuit for any computation. That is why latency is higher in digital systems than in analog systems.
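To make that concrete, here is a purely illustrative latency budget in Python (every figure is a made-up placeholder, not a measurement of any real device): the digital path pays for ADC conversion, DSP/MCU computation and DAC reconstruction on top of propagation, while the analog path pays only for propagation.

```python
# Purely illustrative latency budget (all figures are hypothetical placeholders,
# not measurements of any particular device).

stages_analog_us = {
    "propagation through analog circuitry": 1.0,
}

stages_digital_us = {
    "anti-alias filtering + ADC conversion": 25.0,   # signal must be sampled first
    "DSP/MCU computation":                   50.0,   # firmware processing time
    "DAC reconstruction":                    25.0,   # back to an analog output
    "propagation through circuitry":          1.0,
}

print("analog path : %.1f us" % sum(stages_analog_us.values()))
print("digital path: %.1f us" % sum(stages_digital_us.values()))
```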
Latency is an expression of how much time it takes for a packet of data to get from one designated point to another.
In a computer system, latency is often used to mean any delay or waiting that increases real or perceived response time beyond the response time desired. Specific contributors to computer latency include mismatches in data speed between the microprocessor and input/output devices and inadequate data buffers.
Within a computer, latency can be reduced or "hidden" by techniques such as prefetching (anticipating the need for data input requests) and multithreading, i.e. overlapping work across multiple execution threads.
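As a toy illustration of that idea (the functions and timings below are invented for the sketch, not taken from any real workload), overlapping a prefetch thread with the computation roughly halves the end-to-end time compared with fetching and computing serially:

```python
import queue
import threading
import time

def fetch_block(i):
    """Simulate a slow input request (the latency we want to hide)."""
    time.sleep(0.05)
    return f"block-{i}"

def process_block(block):
    """Simulate computation on one block of data."""
    time.sleep(0.05)

def run_serial(n):
    start = time.time()
    for i in range(n):
        process_block(fetch_block(i))        # fetch, then compute, one after another
    return time.time() - start

def run_prefetching(n):
    q = queue.Queue(maxsize=1)
    def producer():
        for i in range(n):
            q.put(fetch_block(i))            # fetch the next block in the background
    threading.Thread(target=producer, daemon=True).start()
    start = time.time()
    for _ in range(n):
        process_block(q.get())               # computation overlaps the next fetch
    return time.time() - start

print("serial     : %.2f s" % run_serial(10))
print("prefetching: %.2f s" % run_prefetching(10))
```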
Latency issues, from the user's perspective, are usually a perceived lag between an action and the response to it. In 3D simulation, for example, when describing a helmet that provides stereoscopic vision and head tracking, latency is the time between the computer detecting head motion and the time it displays the appropriate image.
Propagation, transmission (whether over optical fibre, wireless or some other medium), routing and other processing, and computer and storage delays are the main sources of latency in digital equipment.
In the case of analogue, as Markus pointed out, none of these intermediate stages is present because, as he put it, analogue travels at the speed of light.
Latency in digital equipment is due to the fact that data have to be processed by some microprocessor or DSP, which takes a finite time. Moreover, many digital processing algorithms need to gather a certain amount of raw data before they can even start.
An example would be real-time audio/video compression/decompression: if your algorithm (say some MPEG variant or other) operates on, say, 150 ms of data at a time, that will be your baseline delay, to which you must add processing, transmission and propagation times. All processing must be causal.
Analogue, on the other hand, can't store data to be processed later. Apart from propagation times, analog processing is virtually instantaneous.
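To put rough numbers on the example above (only the 150 ms frame size comes from the answer; every other figure is a hypothetical placeholder), the buffering requirement alone dominates the digital budget:

```python
# Back-of-the-envelope latency for a block-based codec, reusing the 150 ms
# frame size from the example above; the remaining figures are hypothetical.

frame_ms       = 150.0   # data that must be buffered before the algorithm can start
encode_ms      = 10.0    # hypothetical encoding time per frame
transmit_ms    = 20.0    # hypothetical serialization/transmission time
propagation_ms = 5.0     # physical propagation, paid by analog and digital alike
decode_ms      = 10.0    # hypothetical decoding time

digital_ms = frame_ms + encode_ms + transmit_ms + propagation_ms + decode_ms
analog_ms  = propagation_ms      # an analog link pays only the propagation cost

print(f"digital end-to-end latency: {digital_ms:.0f} ms")
print(f"analog  end-to-end latency: {analog_ms:.0f} ms")
```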
A signal in its original form is amplitude varying with time (light, sound, electrical, electromagnetic, etc.) and travels at the speed of light. It may be fundamental (a single frequency) or a mixture of many frequencies, and it can easily be corrupted by electromagnetic disturbances.
Something must be done to keep it clean, and the solution is signal processing: the signal is broken into instantaneous amplitude samples and then reconstructed from them, which is what digital signal processing does. In that form the signal cannot be corrupted (it stays clean), and the samples (instantaneous amplitudes) can be stored and reproduced whenever needed.
The only cost is the extra time taken to reproduce it, which is why digital equipment has more latency than analog. Digitization can be made faster, but it can never be faster than analog. For that reason I used an analog filter to extract the positive-, negative- and zero-sequence components of the line current for fault-type detection, where a faster response was needed; see the paper 'Microprocessor based fault type detection and.............for alternators and power transformers'.
For a simple answer, chalk it up to processing time in a microprocessor (DSP, MCU, etc.). These devices operate on an internal clock and have the ability to store data, operate on it in software, and then reproduce it. Purely analog systems don't store anything.
Even in the absence of delays applied by digital filtering elements in the control computer/processor, there is the delay associated with the zero-order-hold (ZOH) which holds the controller's discrete-time command signal constant for one sampling period (otherwise there would just be an impulse followed by nothing!). So by the end of the hold period, immediately before the next command is issued, the command is 'stale' and one sampling period old. You don't get this effect in continuous-time systems. They always generate a new command instantaneously based on the current measurement of the plant's output.
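A small simulation makes the point; it is only a sketch with an arbitrary 0.1 s sampling period and a toy sinusoidal command, assuming NumPy is available:

```python
import numpy as np

T = 0.1                           # controller sampling period (s), arbitrary choice
t = np.arange(0.0, 2.0, 0.001)    # fine "continuous-time" grid

# Command a continuous-time controller would produce (toy example: a 1 Hz sine).
u_cont = np.sin(2 * np.pi * 1.0 * t)

# Zero-order hold: between samples, keep emitting the most recent sampled command.
last_sample_time = np.floor(t / T) * T
u_zoh = np.sin(2 * np.pi * 1.0 * last_sample_time)

# Age of the held command: zero right after a sample, almost T just before the next.
age = t - last_sample_time
print(f"max command age : {age.max()*1e3:.1f} ms (about one sampling period)")
print(f"mean command age: {age.mean()*1e3:.1f} ms (about T/2, the usual ZOH delay figure)")
print(f"worst-case error vs. continuous command: {np.max(np.abs(u_cont - u_zoh)):.3f}")
```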
There are many good answers. However, I want to add a comment concerning latency. Any system has dynamic performance parameters, and one of these parameters is the delay between applying an input signal and getting a response at the output. In some systems, such as networks, it is called latency.
The delay can be divided into three types according to its nature: propagation delay (also termed group delay), processing time (such as filtering), and stopping time due to storage in memory. The first two exist in both analog and digital systems, while the last exists only in digital systems, where binary memory elements are available.
I want to add that digital systems may have larger latency, but they can also be used to realize very long delays that cannot be implemented by analog systems.
It is memory that makes the big difference as far as latency is concerned.
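A minimal sketch of that difference (not drawn from any particular product): with memory, a digital system can realize an arbitrarily long, sample-exact delay simply by writing into a buffer and reading out later, which a purely analog circuit cannot do.

```python
from collections import deque

class DelayLine:
    """Delay the input by exactly `delay_samples` samples using stored memory."""
    def __init__(self, delay_samples):
        # Pre-fill with zeros: the buffer *is* the memory that creates the delay.
        self.buffer = deque([0.0] * delay_samples, maxlen=delay_samples + 1)

    def process(self, x):
        self.buffer.append(x)          # store the newest sample
        return self.buffer.popleft()   # emit the sample stored delay_samples ago

# Delay a short ramp by 3 samples.
dl = DelayLine(3)
print([dl.process(x) for x in [1, 2, 3, 4, 5, 6]])
# -> [0.0, 0.0, 0.0, 1, 2, 3]
```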
Pure analogue systems have no time delay, only phase shift, so the signal travels at roughly the speed of light.
The simplest of logic gates will have a propagation delay, and any sampled system, or any system using storage, will have read/write access delays, even if it is an asynchronous system.