I'm working with diamond-like carbon (DLC) thin films. In my case the psi and delta maxima shift to the right (from lower to higher wavelengths) with temperature. Kindly suggest what kind of changes in the film lead to this shift of the maxima.
Most likely your film is not changing at all. Heating any material leads to lattice expansion and consequently to changes in the electronic band structure. In general, absorption structures shift to the red (higher wavelengths) and have a reduced amplitude with increasing temperature.
Sir, actually I didn't heat the deposited film; rather, I deposited the DLC films at different substrate temperatures. The shift to the right occurs with increasing substrate temperature. Of course, this is a permanent shift between the different films.
Then the shifts in psi & delta may simply come from changes in film thickness. For example, 12 nm, 18 nm, and 22 nm films will all give slightly different psi & delta curves, with the oscillations getting closer together as the film gets thicker. A textbook I have found very useful for my own analysis is "The Physics of Thin Film Optical Spectra" by Olaf Stenzel. Have you begun modeling your data yet? If so, what model are you using, and are you detecting any changes in your model parameters?
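The thickness effect described above is easy to reproduce numerically. Below is a minimal sketch of the standard three-phase (ambient/film/substrate) ellipsometry model built from Fresnel coefficients; the indices used (2.2 for a DLC-like film, 3.9 - 0.02i for a Si-like substrate) are placeholder assumptions for illustration, not values for any particular sample:

```python
import numpy as np

def psi_delta(d_nm, n_film, n_sub, wl_nm, aoi_deg=70.0, n_amb=1.0):
    """Ellipsometric psi, delta (degrees) for an ambient/film/substrate stack.

    Three-phase model; n_film and n_sub may be complex (n - i*k convention,
    matched to the exp(-i*beta) film phase used below).
    """
    th0 = np.deg2rad(aoi_deg)
    sin0 = n_amb * np.sin(th0)
    # Snell's law at each interface (complex angles allowed)
    cos0 = np.cos(th0)
    cos1 = np.sqrt(1 - (sin0 / n_film) ** 2)
    cos2 = np.sqrt(1 - (sin0 / n_sub) ** 2)

    # Fresnel reflection coefficients for s and p polarization
    def r_s(ni, ci, nj, cj):
        return (ni * ci - nj * cj) / (ni * ci + nj * cj)

    def r_p(ni, ci, nj, cj):
        return (nj * ci - ni * cj) / (nj * ci + ni * cj)

    # Film phase thickness; the round-trip factor exp(-2i*beta) carries
    # the interference that produces the psi/delta oscillations
    beta = 2 * np.pi * d_nm * n_film * cos1 / wl_nm
    ph = np.exp(-2j * beta)

    def stack(r01, r12):
        return (r01 + r12 * ph) / (1 + r01 * r12 * ph)

    rs = stack(r_s(n_amb, cos0, n_film, cos1), r_s(n_film, cos1, n_sub, cos2))
    rp = stack(r_p(n_amb, cos0, n_film, cos1), r_p(n_film, cos1, n_sub, cos2))

    rho = rp / rs  # rho = tan(psi) * exp(i*delta)
    psi = np.degrees(np.arctan(np.abs(rho)))
    delta = np.degrees(np.angle(rho))
    return psi, delta

wl = np.linspace(300, 800, 501)  # wavelength grid, nm
# Thicker film -> more fringe orders in the same range -> denser oscillations
psi_thin, _ = psi_delta(100, 2.2 + 0j, 3.9 - 0.02j, wl)
psi_thick, _ = psi_delta(300, 2.2 + 0j, 3.9 - 0.02j, wl)
```

Plotting `psi_thin` and `psi_thick` against `wl` shows directly how a thickness change alone compresses the oscillations and shifts the positions of the maxima.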
If you see oscillations in the psi and delta spectra, they are likely interference fringes in the film. These fringes are sensitive to the thickness of the film and to its n and k values, so all of these parameters can affect the positions of the minima and maxima. If the film is thick enough, it is feasible to extract all of these parameters with a multi-angle measurement and a regression against a stack model. Onto which substrate have you deposited the films? Can you show a set of spectra as an example?
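The multi-angle regression mentioned above can be sketched as follows: simulate "measured" complex rho spectra at two angles for a known film, then recover thickness and index by least squares. Everything here is illustrative (Si-like substrate index, a 150 nm / n = 2.1 film, angles of 60 and 70 degrees), not a description of any particular instrument or software:

```python
import numpy as np
from scipy.optimize import least_squares

def rho(d_nm, n_f, wl_nm, aoi_deg, n_sub=3.9 - 0.02j):
    """Complex ellipsometric ratio rho = r_p/r_s for an air/film/substrate stack."""
    th0 = np.deg2rad(aoi_deg)
    s0 = np.sin(th0)
    c0 = np.cos(th0)
    c1 = np.sqrt(1 - (s0 / n_f) ** 2)     # angle in the film
    c2 = np.sqrt(1 - (s0 / n_sub) ** 2)   # angle in the substrate
    # Fresnel coefficients at the two interfaces
    rs01 = (c0 - n_f * c1) / (c0 + n_f * c1)
    rs12 = (n_f * c1 - n_sub * c2) / (n_f * c1 + n_sub * c2)
    rp01 = (n_f * c0 - c1) / (n_f * c0 + c1)
    rp12 = (n_sub * c1 - n_f * c2) / (n_sub * c1 + n_f * c2)
    ph = np.exp(-4j * np.pi * d_nm * n_f * c1 / wl_nm)  # round-trip film phase
    return ((rp01 + rp12 * ph) / (1 + rp01 * rp12 * ph)) / \
           ((rs01 + rs12 * ph) / (1 + rs01 * rs12 * ph))

wl = np.linspace(400, 800, 101)   # nm
angles = (60.0, 70.0)             # multi-angle measurement

# Synthetic "measured" data: a 150 nm film with n = 2.1 (assumed, noise-free)
meas = [rho(150.0, 2.1, wl, a) for a in angles]

def residuals(p):
    d, n = p
    res = []
    for a, m in zip(angles, meas):
        r = rho(d, n, wl, a)
        res += [np.real(r - m), np.imag(r - m)]
    return np.concatenate(res)

# Start from a deliberately offset guess; the fit should recover d and n
fit = least_squares(residuals, x0=[130.0, 2.0])
```

With real data one would add the k dispersion of the film as extra fit parameters and beware of thickness/index correlation and local minima, which is why a sensible starting guess (and multiple angles) matters.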