Let the amplitude of a continuous-time (CT) signal x(t) be distributed according to a random variable (RV) X, with differential entropy (information content) h(X). If x(t) is sampled without quantizing, the amplitude distribution of the sampled signal x(n) is the same as that of x(t), no matter what the sampling frequency is. This seems to imply an apparent violation of the sampling theorem. How can this difficulty be resolved?
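For concreteness, here is a minimal numerical sketch of the premise (the process, parameter values, and function names below are illustrative assumptions, not from the question): it draws a bandlimited Gaussian process, samples it without quantizing at two very different rates, and estimates the differential entropy of the sample amplitudes with a crude histogram estimator. The two estimates coincide, showing that the marginal amplitude distribution, and hence h(X), is independent of the sampling rate.

```python
import numpy as np
from scipy.signal import butter, lfilter

rng = np.random.default_rng(0)

# Dense grid standing in for "continuous time": white Gaussian noise
# lowpass-filtered to roughly B = 100 Hz, rendered at fs_ct = 20 kHz.
fs_ct, B, N = 20_000, 100.0, 2_000_000
b, a = butter(4, B / (fs_ct / 2))
x_ct = lfilter(b, a, rng.standard_normal(N))

def h_est(x, bins=100):
    """Crude histogram estimate of differential entropy, in bits."""
    p, edges = np.histogram(x, bins=bins, density=True)
    dx = edges[1] - edges[0]  # bin width
    p = p[p > 0]              # skip empty bins (0 * log 0 = 0)
    return -np.sum(p * np.log2(p)) * dx

# Sample without quantizing at two very different rates.
x_slow = x_ct[::100]   # 200 Hz (at the Nyquist rate 2B)
x_fast = x_ct[::10]    # 2 kHz  (10x oversampled)

# The amplitude histograms, and hence the entropy estimates, agree:
# the marginal distribution of the samples does not depend on the rate.
print(f"h(slow) = {h_est(x_slow):.3f} bits, h(fast) = {h_est(x_fast):.3f} bits")
```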
