A project I am working on is the evaluation of stigma temperature under outdoor conditions (solar radiation up to 800-900 W/m², air temperature varying between 10 and 30 degrees Celsius).

I am using three different instruments:

1. A thermal camera that attaches to a cell phone (Thermal Expert Q1)

2. Type T thermocouples, 32 AWG (0.008 inches, or about 0.2 mm, in diameter)

3. An IR thermometer

The instruments were calibrated with a certified digital thermometer.
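As a side note, the calibration itself is a simple linear fit against the certified thermometer. A minimal sketch, with hypothetical paired readings rather than our actual calibration data:

```python
import numpy as np

# Hypothetical paired readings taken during calibration (deg C)
reference = np.array([5.0, 15.0, 25.0, 35.0])   # certified digital thermometer
instrument = np.array([4.6, 14.9, 25.3, 35.8])  # instrument under test

# Fit a linear calibration: reference = gain * instrument + offset
gain, offset = np.polyfit(instrument, reference, 1)

def calibrate(reading: float) -> float:
    """Convert a raw instrument reading to the reference scale."""
    return gain * reading + offset

print(calibrate(20.0))  # calibrated value for a raw 20 deg C reading
```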

From the experiment, I was able to identify some sources of error for instruments 1 and 2.

First, for the thermal camera, it is difficult to get stable temperatures. As a result, every time a picture is taken, a reference object is included in the frame. The temperature of this reference object is continuously monitored by a thermocouple attached to it. A correction is then computed by comparing the pixels covering the reference object with the thermocouple reading at the exact moment the picture was taken. For example, if the IR picture reads 10 degrees on the reference object while the thermocouple reads 15 degrees, the correction is 5 degrees.
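In code, this amounts to a single offset applied to the whole image. A minimal sketch, assuming the radiometric image is available as a NumPy array of temperatures and a boolean mask marks the reference-object pixels (the names are illustrative, not the camera's actual API):

```python
import numpy as np

def offset_correct(ir_image: np.ndarray,
                   ref_mask: np.ndarray,
                   tc_temp: float) -> np.ndarray:
    """Apply a uniform offset correction to a thermal image.

    ir_image : 2-D array of per-pixel temperatures (deg C)
    ref_mask : boolean array, True over the reference object
    tc_temp  : thermocouple reading at capture time (deg C)
    """
    ref_reading = ir_image[ref_mask].mean()  # camera's view of the reference
    correction = tc_temp - ref_reading       # e.g. 15 - 10 = +5 deg C
    return ir_image + correction             # same offset for every pixel
```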

This correction is then applied to every pixel. Although this addresses one issue, it creates others. First, the thermocouple reading does not necessarily match the radiation emitted by the reference object, since a contact sensor and a radiometric measurement do not respond to the same quantity. Second, the correction applied to every pixel is a constant (for example, 5 degrees), yet we have noticed that the pixels do not all require the same correction (some need more, others less). Still, this method of correction is the best we have right now.
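For illustration only: if two reference objects at different temperatures were placed in the frame, the constant offset could be replaced by a gain-and-offset (two-point) correction. A sketch under that assumption, with hypothetical names; this is not something we currently do:

```python
import numpy as np

def two_point_correct(ir_image: np.ndarray,
                      mask_cold: np.ndarray, tc_cold: float,
                      mask_hot: np.ndarray, tc_hot: float) -> np.ndarray:
    """Linear (gain + offset) correction from two reference objects.

    Falls back to the constant-offset correction when the two
    references read the same temperature on the camera.
    """
    read_cold = ir_image[mask_cold].mean()
    read_hot = ir_image[mask_hot].mean()
    if np.isclose(read_cold, read_hot):
        # Degenerate case: behaves like the constant-offset correction
        return ir_image + (tc_cold - read_cold)
    gain = (tc_hot - tc_cold) / (read_hot - read_cold)
    offset = tc_cold - gain * read_cold
    return gain * ir_image + offset
```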

Second, for the thermocouples. Thermocouples do not measure the actual object temperature but their own temperature. This is an issue for a project such as ours, where it is impossible to attach a thermocouple to a stigma (because of its small size); we therefore placed the thermocouple as close to the stigma as we could. In addition, in the presence of solar radiation, the heat capacity of the metal forming the thermocouple junction differs from that of the stigma, so the two do not heat up the same way, leading to erroneous values.

Nonetheless, when all three methods are pooled together, we notice that the IR camera and the thermocouples give nearly consistent results, while the IR thermometer is almost systematically cooler than the other two methods (by about 1.5 degrees Celsius). This is odd and difficult to explain. Moreover, the IR thermometer values always make the stigma cooler than the air, which would not make much physical sense, as stigmas have no cooling mechanism to our knowledge. Consequently, I am wondering whether anybody has experience with any of these three instruments and could help me better understand what the issue might be, and, most importantly, which instrument is actually best for measuring temperature.
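For anyone who would like to compare against their own data, the pairwise offsets come out of a calculation along these lines (a sketch with hypothetical, time-matched readings; the numbers below are made up to mirror the ~1.5 degree pattern, not our actual data):

```python
import numpy as np

# Hypothetical time-matched stigma temperatures (deg C)
t_camera = np.array([22.1, 24.3, 25.0, 23.8])
t_thermocouple = np.array([22.0, 24.5, 24.8, 23.9])
t_ir_gun = np.array([20.5, 22.9, 23.6, 22.2])

# Mean bias of each instrument relative to the thermocouple
print("camera - TC:", np.mean(t_camera - t_thermocouple))  # ~ 0.0 deg C
print("IR gun - TC:", np.mean(t_ir_gun - t_thermocouple))  # ~ -1.5 deg C
```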

Thank you for your time,
