I am looking for a mathematical relation between data collection rate (sampling frequency) and detector noise. In general, the noise increases when the sampling frequency is increased. This is with special reference to typical liquid chromatography detectors such as UV-Vis photodiode array and conductivity detectors. Nyquist's theorem tells us the minimum number of data points that must be collected to accurately represent the true signal, but it does not say anything about how the noise behaves. How does noise increase with sampling frequency? One can easily see this effect practically (rough simulation sketches of what I mean are included below), but I could not find:

(A) a general mathematical relationship between data sampling frequency and noise.

(B) this could be a misconception, but does the detector time constant also increase the noise? One would assume that the time constant would only increase peak widths.
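To illustrate point (A), here is a minimal simulation sketch. It assumes that the raw detector noise is white and that the instrument produces each reported point by averaging (bunching) the fast internal ADC samples that fall within one output interval 1/f_s. The internal rate of 1000 Hz and the output rates are purely illustrative, not taken from any particular detector.

```python
# Minimal sketch: RMS baseline noise vs. output sampling rate, assuming
# white raw noise and per-point averaging (bunching). All values illustrative.
import numpy as np

rng = np.random.default_rng(0)

raw_rate_hz = 1000.0                                      # hypothetical internal ADC rate
raw_noise = rng.normal(0.0, 1.0, int(raw_rate_hz * 60))   # 60 s of unit-RMS white noise

def bunch(signal, raw_rate_hz, out_rate_hz):
    """Average consecutive raw samples so that the output rate is out_rate_hz."""
    block = int(round(raw_rate_hz / out_rate_hz))
    n_blocks = len(signal) // block
    return signal[:n_blocks * block].reshape(n_blocks, block).mean(axis=1)

print("output rate (Hz)   measured RMS   predicted sqrt(f_s/f_raw)")
for f_s in (1.25, 2.5, 5.0, 10.0, 20.0, 40.0):
    out = bunch(raw_noise, raw_rate_hz, f_s)
    # Averaging N = f_raw/f_s samples per point reduces white noise by sqrt(N),
    # so the reported baseline noise grows roughly as sqrt(f_s).
    print(f"{f_s:14.2f}   {out.std():12.4f}   {np.sqrt(f_s / raw_rate_hz):23.4f}")
```

Under this model the baseline RMS noise scales as sqrt(f_s), i.e. as 1/sqrt(averaging time per point). That is the kind of relationship I am after, if it is really what happens inside commercial LC detectors.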

The attached figure is a typical example available on the web.
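Regarding point (B), my (possibly mistaken) mental model of the time constant is a single-pole RC low-pass filter applied to the same white noise. Under that assumption a longer time constant should reduce the baseline noise (roughly as 1/sqrt(tau)) while broadening fast peaks, rather than increase the noise. The sketch below simply writes that assumption out, again with illustrative numbers.

```python
# Minimal sketch: RMS baseline noise vs. detector time constant, assuming
# the time constant acts as a single-pole (RC/exponential) low-pass filter.
import numpy as np

rng = np.random.default_rng(1)

rate_hz = 1000.0                                    # hypothetical raw sampling rate
noise = rng.normal(0.0, 1.0, int(rate_hz * 60))     # 60 s of unit-RMS white noise

def rc_filter(signal, rate_hz, tau_s):
    """Single-pole low-pass filter with time constant tau_s (exponential smoothing)."""
    dt = 1.0 / rate_hz
    alpha = dt / (tau_s + dt)
    out = np.empty_like(signal)
    acc = 0.0
    for i, x in enumerate(signal):
        acc += alpha * (x - acc)                    # y[i] = y[i-1] + alpha*(x[i] - y[i-1])
        out[i] = acc
    return out

print("time constant (s)   measured RMS")
for tau in (0.05, 0.1, 0.2, 0.5, 1.0, 2.0):
    filtered = rc_filter(noise, rate_hz, tau)
    settled = filtered[int(5 * rate_hz):]           # discard the first 5 s (filter settling)
    # Noise-equivalent bandwidth of an RC filter is 1/(4*tau), so RMS ~ 1/sqrt(tau).
    print(f"{tau:17.2f}   {settled.std():12.4f}")
```

If this model is right, the time constant by itself lowers the noise and broadens the peaks rather than raising the noise; I would appreciate a correction if that reasoning is off.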

Thanks.
