I'm using a Renishaw inVia Raman microscope system (785 nm excitation, 600 l/mm grating) to acquire Raman images of human brain samples, mounted on MgF2 slides, at 50 µm resolution. The system has been fully calibrated.

I'm looking at the 1400–1660 cm⁻¹ range. But when I generate the images, I see line artifacts running across them where there is a drop in signal. I've attached GIFs of the non-normalised and normalised images, sweeping from 1400 to 1660 cm⁻¹.
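For reference, this is roughly how I build each frame of the sweep (a minimal sketch, assuming the map has been exported as a NumPy cube of shape (rows, cols, n_wavenumbers); the file names are placeholders):

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder file names; cube assumed to be (rows, cols, n_wavenumbers)
cube = np.load("raman_cube.npy")
wn = np.load("wavenumbers.npy")          # 1D Raman-shift axis in cm-1

idx = np.argmin(np.abs(wn - 1450))       # frame at ~1450 cm-1, for example

# Non-normalised frame: raw intensity at that wavenumber
frame = cube[:, :, idx]

# Normalised frame: each spectrum divided by its area over 1400-1660 cm-1
band = (wn >= 1400) & (wn <= 1660)
area = cube[:, :, band].sum(axis=2, keepdims=True)
frame_norm = (cube / np.clip(area, 1e-12, None))[:, :, idx]

fig, (ax0, ax1) = plt.subplots(1, 2)
ax0.imshow(frame, cmap="inferno")
ax0.set_title("Non-normalised")
ax1.imshow(frame_norm, cmap="inferno")
ax1.set_title("Area-normalised")
plt.show()
```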

For comparison, I've attached the spectra for columns 20 and 80.
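In case the exact comparison matters, this is essentially how I extract them (same assumed cube layout and placeholder file names as above):

```python
import numpy as np
import matplotlib.pyplot as plt

cube = np.load("raman_cube.npy")         # (rows, cols, n_wavenumbers)
wn = np.load("wavenumbers.npy")

# Mean spectrum down each of the two columns being compared
spec_20 = cube[:, 20, :].mean(axis=0)
spec_80 = cube[:, 80, :].mean(axis=0)

plt.plot(wn, spec_20, label="column 20")
plt.plot(wn, spec_80, label="column 80")
plt.xlabel("Raman shift (cm$^{-1}$)")
plt.ylabel("Intensity (a.u.)")
plt.legend()
plt.show()
```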

I used a raster scan with a 1 s exposure time; any longer and I tend to hit the detector's upper detection limit. The scans typically take around 10 hours.
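(At 1 s per point, 10 hours works out to roughly 36,000 spectra, i.e. a map on the order of 190 × 190 pixels, ignoring readout and stage overhead.)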

Has anyone come across this before? Is it an acquisition issue or a processing issue? The GIFs make it look like there's some kind of peak shift, but I wouldn't expect that here.
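To check whether there's a genuine shift on my side, I was planning to track the band centroid per column, something along these lines (same assumed cube layout as above; the 1430–1470 cm⁻¹ window around the CH deformation band is just an illustrative choice):

```python
import numpy as np
import matplotlib.pyplot as plt

cube = np.load("raman_cube.npy")         # (rows, cols, n_wavenumbers)
wn = np.load("wavenumbers.npy")

# Intensity-weighted centroid of the ~1450 cm-1 band per pixel;
# the 1430-1470 window is an arbitrary choice for illustration
band = (wn >= 1430) & (wn <= 1470)
sub = cube[:, :, band]
centroid = (sub * wn[band]).sum(axis=2) / np.clip(sub.sum(axis=2), 1e-12, None)

# A step in the per-column median would indicate a real shift,
# not just a display effect in the GIFs
plt.plot(np.median(centroid, axis=0))
plt.xlabel("Column index")
plt.ylabel("Band centroid (cm$^{-1}$)")
plt.show()
```

Thank you in advance.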
