11 November 2014

While looking for the experimental conditions that give the cleanest Raman spectra of disordered carbons, I realised that the choice of grating has a significant influence on the results. Let me explain. Switching from a 600 l/mm grating to 1200, 1800 or 2400 l/mm obviously gives higher spectral resolution but also a lower signal-to-noise ratio. However, I also observed that the G/D intensity ratio changed when I did this; do you know why? To recover a high signal-to-noise ratio with a grating having more lines/mm, I used longer acquisition times until the spectra were smooth and very nice, but the relative intensities were completely different. Could this be due to local heating of the sample caused by the correspondingly long acquisition time? What do you suggest for obtaining reproducible spectra? I have never seen this problem reported in the literature, so I wonder whether it could be a temperature effect or a calibration problem with my spectrometer.
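For reference, here is a minimal sketch of how one might extract the I_D/I_G ratio in a way that stays consistent across gratings, fitting two Lorentzians plus a linear baseline (this is only an illustration, assuming Python with NumPy/SciPy; the `raman_shift` and `intensity` arrays and the band positions near 1350 and 1580 cm⁻¹ are placeholders, not my actual processing):

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentzian(x, amplitude, center, width):
    """Single Lorentzian peak."""
    return amplitude * width**2 / ((x - center)**2 + width**2)

def d_g_model(x, a_d, c_d, w_d, a_g, c_g, w_g, b0, b1):
    """Two Lorentzians (D and G bands) on a linear baseline."""
    return (lorentzian(x, a_d, c_d, w_d)
            + lorentzian(x, a_g, c_g, w_g)
            + b0 + b1 * x)

def fit_id_ig(raman_shift, intensity):
    """Fit the 1000-1800 cm-1 region and return the I_D/I_G peak-height ratio."""
    mask = (raman_shift > 1000) & (raman_shift < 1800)
    x, y = raman_shift[mask], intensity[mask]
    # Initial guesses: D band near 1350 cm-1, G band near 1580 cm-1
    p0 = [y.max(), 1350, 50, y.max(), 1580, 30, y.min(), 0.0]
    popt, _ = curve_fit(d_g_model, x, y, p0=p0, maxfev=10000)
    a_d = popt[0]
    a_g = popt[3]
    return a_d / a_g
```

If the ratio obtained from the same fitting procedure still shifts when only the grating changes, that would point to an instrumental or sample effect (grating/detector response, or local heating during the longer acquisition) rather than to an artefact of the data treatment.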
