Because the penetration depth of the evanescent wave in ATR is wavelength-dependent, ATR spectra show absorption band intensities that differ from those measured in transmission. But what can cause baseline effects, both constant offsets and sloping (linear or exponential) baselines, in spectra recorded via ATR? As far as I understand, they cannot arise from scattering or interference effects, as they do in transmission or transflection geometries.
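For context, here is a minimal sketch of the wavelength dependence I am referring to, using the standard penetration-depth formula d_p = λ / (2π n₁ √(sin²θ − (n₂/n₁)²)). The parameter values are assumptions for illustration only: a diamond crystal (n₁ ≈ 2.4), a typical organic sample (n₂ ≈ 1.5), and a 45° angle of incidence.

```python
import numpy as np

# Assumed parameters (typical, hypothetical values):
# diamond ATR crystal, organic sample, 45 deg incidence
n1, n2 = 2.4, 1.5
theta = np.deg2rad(45.0)

wavenumbers = np.array([4000.0, 2000.0, 1000.0, 650.0])  # cm^-1
wavelengths_um = 1e4 / wavenumbers                        # lambda in micrometres

# Evanescent-wave penetration depth:
# d_p = lambda / (2*pi*n1*sqrt(sin^2(theta) - (n2/n1)^2))
d_p = wavelengths_um / (2 * np.pi * n1 * np.sqrt(np.sin(theta)**2 - (n2 / n1)**2))

for nu, dp in zip(wavenumbers, d_p):
    print(f"{nu:6.0f} cm^-1  ->  d_p = {dp:.2f} um")
```

With these assumed values, d_p grows from roughly 0.5 µm at 4000 cm⁻¹ to about 3 µm at 650 cm⁻¹, which accounts for the intensity differences across the spectrum but not, as far as I can tell, for a constant or sloping baseline.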

I have not been able to find a clear answer in the literature, so any references are welcome.

Thank you in advance.

Pjotr
