I fabricated TiO2 MOS capacitors with Ag/ITO as the front contact, n-Si as the substrate, and Mg/Al as the back ohmic contact, and tried to extract their properties from CV measurements. At room temperature the depletion region on the CV curve was not distinguishable, and the capacitance under low forward bias (0 to +4 V) was quite low, so initially I suspected the issue came from a high series resistance (Rs).

After doing temperature-dependent CV and correcting the curves with the model proposed in the paper "MOS capacitance measurements for high-leakage thin dielectrics", I found that at higher test temperatures the depletion region became much more distinguishable, which might come from the increased carrier concentration and the narrower depletion width, and the capacitance drop in the 0 to +4 V range was smaller than at lower temperatures.

However, when I compared the extracted series resistance (Rs) and leakage resistance (Rp), Rs did not change much across test temperatures, while Rp increased at higher temperature, implying less leakage current at higher temperature. This is counterintuitive to me, because in most of the literature leakage current and temperature are positively correlated. So I am wondering: is there any conduction mechanism that predicts an inverse correlation between leakage current and temperature?

Attached is the data from the CV tests; conditions A and B refer to different TiO2 deposition conditions.
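In case it clarifies what I mean by "Rs" and "Rp": the cited paper uses (as I understand it) a three-element equivalent circuit, the oxide/depletion capacitance C with a leakage resistance Rp in parallel, both in series with Rs, and corrects the measured parallel-mode readings accordingly. Below is a minimal sketch in Python of extracting Rs, Rp, and C by fitting that circuit to parallel-mode (Cp, G) readings at two frequencies. The numbers are placeholders, not my real data, and this is only an illustration of the equivalent circuit, not the exact closed-form correction from the paper.

import numpy as np
from scipy.optimize import least_squares

# Placeholder parallel-mode readings at one bias point (hypothetical values).
freqs = np.array([1e4, 1e6])            # Hz (e.g. 10 kHz and 1 MHz)
Cp    = np.array([3.2e-10, 2.9e-10])    # F, measured parallel capacitance
G     = np.array([2.0e-5, 2.5e-5])      # S, measured parallel conductance

omega  = 2 * np.pi * freqs
Z_meas = 1.0 / (G + 1j * omega * Cp)    # measured complex impedance

def residuals(p):
    # Three-element model: Z(w) = Rs + 1 / (1/Rp + j*w*C); p = [Rs, Rp, C].
    Rs, Rp, C = p
    Z_model = Rs + 1.0 / (1.0 / Rp + 1j * omega * C)
    diff = (Z_model - Z_meas) / np.abs(Z_meas)   # normalize per frequency
    return np.concatenate([diff.real, diff.imag])

# Initial guesses: Rs from the high-frequency real part of Z, Rp from 1/G, C from Cp.
p0 = [Z_meas[-1].real, 1.0 / G[0], Cp[0]]
fit = least_squares(residuals, p0, bounds=([0.0, 0.0, 0.0], [np.inf, np.inf, np.inf]))
Rs, Rp, C = fit.x
print(f"Rs = {Rs:.2f} ohm, Rp = {Rp:.2e} ohm, C = {C:.3e} F")

Repeating this fit at each bias and each temperature is how I compare Rs and Rp between the two conditions.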
