I am trying a simple method to sense the liquid level in an open tank using the hydrostatic principle. Though it resembles the bubbler/purge method of level sensing, it avoids the need for external purging, which makes it simpler.

The setup consists of a hollow SS tube of a particular diameter that runs down to the bottom of the tank. The top end of the tube is connected to a U-tube manometer via a narrow pipe. The manometer shows zero deflection until the liquid level in the tank touches the bottom tip of the SS tube. Once the liquid level crosses the bottom tip, a fixed volume of air is trapped between the liquid in the SS tube and the manometric fluid. The pressure of this trapped air increases proportionally with the rise in the tank's liquid level, which produces a deflection in the manometer. The Δh of the manometer can be calibrated against the liquid level H, and the level can then be measured.

Now the issue is temperature compensation. The trapped air expands and contracts with ambient temperature. For a particular liquid level H, the manometer deflection is constant only at a constant ambient temperature. If the temperature increases, the pressure also increases, showing a positive deflection in Δh, and vice versa. The variation of Δh with temperature is linear, but the slope is different for different liquid levels H, so a direct compensation is not possible. With a change in temperature, the volume, pressure, and density of the trapped air all change. I am stuck on this problem: how can I compensate the level reading for changes in temperature? I have not found the right way to proceed.
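Since the Δh-vs-temperature slope depends on the level H, one practical workaround is to treat it as a two-variable calibration rather than a single correction factor: record Δh at several known levels H for each of several ambient temperatures, then invert that table at run time using the measured Δh and a temperature sensor. A minimal sketch in Python of that idea, where the calibration numbers, units, and function names are purely illustrative assumptions, not measured data:

```python
# Hypothetical calibration table: for each ambient temperature (deg C),
# the manometer deflection dh (mm) measured at known tank levels H (cm).
# All numbers are illustrative placeholders, not real measurements.
CAL = {
    20.0: [(0, 0.0), (50, 40.0), (100, 85.0), (150, 135.0)],  # (H, dh)
    30.0: [(0, 0.0), (50, 44.0), (100, 93.0), (150, 147.0)],
    40.0: [(0, 0.0), (50, 48.0), (100, 101.0), (150, 159.0)],
}

def _interp(x, pts):
    """Piecewise-linear interpolation through (x, y) pairs sorted by x."""
    if x <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return pts[-1][1]

def level_from_dh(dh, temp_c):
    """Estimate tank level H from deflection dh and ambient temperature.

    1. At the two calibration temperatures bracketing temp_c, invert the
       dh(H) curve (it is monotonic in H) to get a level estimate at each.
    2. Linearly interpolate those two estimates in temperature.
    """
    temps = sorted(CAL)

    def invert(t):
        # Swap (H, dh) pairs to (dh, H) so we can interpolate H from dh.
        inv = [(dh_i, h_i) for (h_i, dh_i) in CAL[t]]
        return _interp(dh, inv)

    if temp_c <= temps[0]:
        return invert(temps[0])
    for t0, t1 in zip(temps, temps[1:]):
        if temp_c <= t1:
            h0, h1 = invert(t0), invert(t1)
            return h0 + (h1 - h0) * (temp_c - t0) / (t1 - t0)
    return invert(temps[-1])
```

This sidesteps deriving an explicit gas-law model: the level-dependent temperature slope is captured empirically by the table, at the cost of a one-time calibration over the expected temperature range. A denser table (or a fitted surface) improves accuracy between calibration points.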
