This is an optical experiment. I have an SR830 lock-in amplifier connected to a photodiode, which measures the reflection from a sample. The intensity of the laser beam is modulated by a chopper, which also provides the reference input to the lock-in amplifier. A phase is detected in the reflection signal from both the sample and the control.

In another part of my setup, the polarization of the laser beam is further modulated by a PEM (photoelastic modulator). A second lock-in amplifier, referenced to the PEM frequency, is used to measure the signal reflected from the sample. A relatively large phase is observed from both the sample and the control.
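For concreteness about the quantities involved, here is a minimal software lock-in sketch in Python showing how the in-phase X, quadrature Y, magnitude R, and phase theta relate for a chopped signal. All numbers (sample rate, chopper frequency, phase) are made up for illustration, not from my actual setup:

```python
import numpy as np

def demodulate(signal, t, f_ref):
    """Software lock-in: mix with a complex reference at f_ref and average
    (a crude low-pass). Returns in-phase X and quadrature Y."""
    Z = 2.0 * np.mean(signal * np.exp(-2j * np.pi * f_ref * t))
    return Z.real, Z.imag

fs = 1.0e6                          # sample rate (hypothetical)
f_chop = 1.37e3                     # chopper frequency (hypothetical)
t = np.arange(0, 0.1, 1.0 / fs)
true_R, true_phase = 1.0, np.deg2rad(25.0)
sig = true_R * np.cos(2 * np.pi * f_chop * t + true_phase)

X, Y = demodulate(sig, t, f_chop)
R = np.hypot(X, Y)                  # magnitude, independent of the phase setting
theta = np.degrees(np.arctan2(Y, X))
print(f"X={X:.3f}  Y={Y:.3f}  R={R:.3f}  theta={theta:.1f} deg")
```

This recovers X = R cos(theta) and Y = R sin(theta), which is the picture I have in mind for the questions below.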
My questions are as follows.
1) Should I press "auto-phase" on the second lock-in amplifier to maximize the detected signal in the in-phase component X?
2) Since I want to see the difference between the target sample and the control, should I first do "auto-phase" on the control and then measure the target sample, as a sort of background subtraction (see the first sketch after this list)?
3) Now, the properties of the sample are tuned by applying an electric field / a magnetic field. This further increases the phase difference between the signal and the reference, indicating a change in the sample properties. Here I want to see how the reflection signal changes with the electric/magnetic field on the sample. I was wondering whether I should use the same phase shift measured in #2 (at zero electric/magnetic field). However, this results in a very large phase shift when I measure my sample even at zero field (see the second sketch below). What is the general rule of thumb for setting the phase in intensity- and polarization-modulation optical experiments?
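To make questions 1 and 2 concrete: my understanding is that "auto-phase" on the SR830 only rotates the demodulated (X, Y) pair so the current signal lands entirely in X; nothing physical changes, and the same rotation can be applied in software afterwards. A sketch of what I mean by "fix the phase on the control, then measure the sample", with all voltages invented for illustration:

```python
import numpy as np

def rotate(X, Y, phi_deg):
    """Rotate the demodulated vector by -phi. When phi is the currently
    measured phase, this is what 'auto-phase' does: signal moves into X."""
    Z = (X + 1j * Y) * np.exp(-1j * np.deg2rad(phi_deg))
    return Z.real, Z.imag

# Hypothetical readings in volts (illustrative only):
Xc, Yc = 0.30, 0.40          # control
Xs, Ys = 0.35, 0.52          # target sample

# Q2: set the phase on the control, then keep it for the sample.
phi_control = np.degrees(np.arctan2(Yc, Xc))
Xc_r, Yc_r = rotate(Xc, Yc, phi_control)    # -> (R_control, ~0)
Xs_r, Ys_r = rotate(Xs, Ys, phi_control)    # sample in the control's frame

# A background subtraction is only meaningful as a vector difference:
dX, dY = Xs_r - Xc_r, Ys_r - Yc_r
print(f"control: ({Xc_r:.3f}, {Yc_r:.3f})  sample: ({Xs_r:.3f}, {Ys_r:.3f})")
print(f"difference: dX={dX:.3f}  dY={dY:.3f}  |d|={np.hypot(dX, dY):.3f}")
```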
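And for question 3, this is the kind of analysis I imagine if the phase is frozen at its zero-field value: a genuine field-induced phase change should then show up as signal rotating from X into Y. All (X, Y) pairs below are invented for illustration:

```python
import numpy as np

# Hypothetical (X, Y) readings at several field values, recorded with the
# lock-in phase frozen at its zero-field setting (numbers are made up):
fields = np.array([0.0, 0.5, 1.0, 1.5])   # e.g. in tesla or kV/cm
X = np.array([0.500, 0.470, 0.380, 0.250])
Y = np.array([0.000, 0.080, 0.210, 0.330])

R = np.hypot(X, Y)                      # magnitude: phase-independent
theta = np.degrees(np.arctan2(Y, X))    # phase relative to zero-field setting

for B, r, th in zip(fields, R, theta):
    print(f"field={B:4.1f}: R={r:.3f}  phase shift={th:6.1f} deg")
```

Is this the right way to think about it, or is there a standard convention for where to fix the phase in each modulation channel?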