We do carbon deposition from a DC plasma in our machine (working pressure ~0.1 mbar). Before we open the machine we first flood it with pure nitrogen, because the walls may still be heated from the deposition process and we want to minimize adsorption of CO, O2, etc. on the chamber walls.

The problem is that the full-range pressure gauge (Pirani and cold cathode) gets stuck at roughly 200 - 300 mbar when atmospheric pressure is reached in the chamber, and the reading then keeps increasing for about half an hour to an hour after opening the chamber (as long as atmospheric pressure is maintained). I would not expect this to be caused by different gas species, since the manufacturer (Pfeiffer) specifies the same measuring behaviour for N2 as for air.
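For reference, here is a rough comparison of thermal conductivities, the quantity the Pirani element's heat-loss signal roughly tracks in the transition regime. The numbers are approximate room-temperature literature values, not measurements from our setup:

```python
# Approximate thermal conductivities at ~300 K, in W/(m*K) (rounded literature
# values). A Pirani gauge reads the heat loss from a hot filament, which is
# gas dependent, so gases with different conductivities give different readings.
k = {
    "air": 0.026,
    "N2": 0.026,   # practically identical to air -> same gauge calibration
    "H2": 0.18,    # roughly 7x higher than N2
    "CH4": 0.034,  # roughly 30% higher than N2
}

for gas, value in k.items():
    print(f"{gas:>3}: {value:.3f} W/(m*K)  ({value / k['N2']:.1f}x N2)")
```

So the manufacturer's claim that N2 and air read alike is consistent with their nearly identical conductivities, while residual H2 (and to a lesser extent CH4) inside the gauge volume could shift the reading noticeably.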

My theory is that some gases from the deposition process (mostly methane and hydrogen) are left in the gauge and it takes a while for them to be replaced by diffusion; a rough estimate of the exchange time is sketched below. My colleague, however, thinks it might be due to very slow temperature regulation of the Pirani filament at higher pressures.
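To check whether the diffusion explanation is at least plausible, here is a minimal back-of-the-envelope sketch. It assumes a dead-ended gauge tube a few centimetres long and uses approximate binary diffusion coefficients of H2 and CH4 in N2 at 1 atm and room temperature (ballpark literature values, not measured for our gauge); the characteristic exchange time is taken as tau ~ L^2/D.

```python
# Rough estimate of how long residual H2 / CH4 needs to diffuse out of a
# dead-ended gauge tube back-filled with N2 at atmospheric pressure.
# Assumptions (not from the original post): tube length L ~ 5-10 cm,
# approximate binary diffusion coefficients at ~300 K and 1 atm.

D_cm2_per_s = {
    "H2 in N2": 0.78,   # approx. literature value
    "CH4 in N2": 0.21,  # approx. literature value
}

for L_cm in (5.0, 10.0):
    for gas, D in D_cm2_per_s.items():
        tau_s = L_cm**2 / D  # characteristic diffusion time
        # near-complete exchange takes a few characteristic times
        print(f"L = {L_cm:4.1f} cm, {gas:9s}: tau ~ {tau_s:4.0f} s "
              f"(~{3 * tau_s / 60:.0f} min for near-complete exchange)")
```

For CH4 and a ~10 cm tube this lands in the tens-of-minutes range, which is at least the same order of magnitude as the half hour to an hour we observe, so the diffusion explanation does not seem unreasonable.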

Has anybody seen something like this before, or does anyone know why this is happening?
