I am working with water quality monitoring sensors (low cost and low power consumption). For temperature measurement, I am trying to decide between the two sensor types (IC temperature sensor vs. thermistor). Please suggest.
The major difference between the IC temperature sensor and the thermistor is that the IC sensor is linear while the thermistor is nonlinear. To monitor the temperature, one has to bias both sensors with a DC power supply. The IC sensor outputs a voltage or current that is proportional to the temperature, whereas with the thermistor one has to apply a voltage and connect a current-sensing resistor in series with it. The voltage developed across the current-sensing resistor is then an indicator of the thermistor temperature.
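As a concrete illustration of that divider measurement, here is a minimal conversion sketch assuming an NTC thermistor described by the simple beta model; the part values (10 kΩ at 25 °C, B = 3950 K, 3.3 V bias) are illustrative, not a recommendation:

```c
#include <math.h>

/* Illustrative part values -- substitute your own. */
#define VSUPPLY    3.3      /* bias voltage (V) */
#define R_SENSE    10000.0  /* current-sensing resistor (ohms) */
#define R0         10000.0  /* NTC resistance at T0 (ohms) */
#define T0_KELVIN  298.15   /* 25 C reference temperature */
#define BETA       3950.0   /* NTC beta constant (K) */

/* Convert the voltage across the sensing resistor to temperature (C)
 * using the beta model: R(T) = R0 * exp(B * (1/T - 1/T0)). */
double thermistor_temp_c(double v_sense)
{
    /* Divider: v_sense = VSUPPLY * R_SENSE / (R_SENSE + R_thermistor) */
    double r_therm = R_SENSE * (VSUPPLY - v_sense) / v_sense;
    double inv_t   = 1.0 / T0_KELVIN + log(r_therm / R0) / BETA;
    return 1.0 / inv_t - 273.15;
}
```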
Because of this nonlinearity, one has to calibrate the thermistor with a microcontroller, embedding the calibration curve in the microcontroller's memory. With a linear sensor, one only needs to store a single value: the sensitivity of the sensor in degrees centigrade per volt or per amp.
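A minimal sketch of that contrast, with made-up calibration numbers: the thermistor needs a stored curve plus interpolation, while the linear sensor needs only an offset and a sensitivity.

```c
/* Embedded calibration curve: measured (ADC code, temperature) pairs
 * stored in flash; the values here are made up for illustration. */
typedef struct { unsigned adc; float temp_c; } cal_point_t;

static const cal_point_t cal[] = {
    { 200,  0.0f }, { 900, 10.0f }, { 2048, 25.0f },
    { 3100, 50.0f }, { 3900, 85.0f },
};
#define N_CAL (sizeof cal / sizeof cal[0])

/* Linear interpolation between the two nearest calibration points. */
float thermistor_temp_from_adc(unsigned code)
{
    if (code <= cal[0].adc)        return cal[0].temp_c;
    if (code >= cal[N_CAL-1].adc)  return cal[N_CAL-1].temp_c;
    for (unsigned i = 1; i < N_CAL; i++) {
        if (code <= cal[i].adc) {
            float frac = (float)(code - cal[i-1].adc)
                       / (float)(cal[i].adc - cal[i-1].adc);
            return cal[i-1].temp_c
                 + frac * (cal[i].temp_c - cal[i-1].temp_c);
        }
    }
    return cal[N_CAL-1].temp_c;  /* not reached */
}

/* By contrast, a linear IC sensor needs only two stored constants: */
float ic_temp_c(float volts)
{
    return (volts - 0.5f) / 0.010f;  /* e.g. 10 mV/C, 500 mV at 0 C */
}
```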
If the temperature range is limited, one can linearize the thermistor's response with a suitably chosen fixed resistor, as in the sketch below.
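A common passive trick, assuming the simple beta model: choose the fixed resistor so that the inflection point of the divider's S-shaped voltage-vs-temperature curve falls at the middle of the range of interest; the response is then nearly linear around that point. The values below are illustrative:

```c
#include <math.h>
#include <stdio.h>

/* Pick the fixed resistor that puts the inflection point of the
 * divider response at the midpoint of the range of interest, which
 * makes the output nearly linear around that point (beta model). */
int main(void)
{
    const double B     = 3950.0;   /* NTC beta constant (K), illustrative */
    const double R0    = 10000.0;  /* resistance at 25 C (ohms) */
    const double T0    = 298.15;   /* 25 C in kelvin */
    const double t_mid = 288.15;   /* middle of range, e.g. 15 C water */

    double r_mid = R0 * exp(B * (1.0 / t_mid - 1.0 / T0));
    double r_lin = r_mid * (B - 2.0 * t_mid) / (B + 2.0 * t_mid);

    printf("Thermistor at midpoint: %.0f ohms\n", r_mid);
    printf("Linearizing resistor:   %.0f ohms\n", r_lin);
    return 0;
}
```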
I think that, because of this important advantage, one would choose an IC sensor when high measurement quality is required.
If one requires a cheap sensor, one would use the thermistor.
Thermistors are available for a wider temperature range than semiconductor sensors.
Thermistors are available with lower tolerances. The tolerance "issue" can be partly resolved by calibration, but this adds some complexity.
The thermistor's nonlinearity can even be an advantage: using a "given" ADC, the thermistor's output can be "tweaked" passively (resistors only) to increase temperature resolution over a limited sub-band of its operating range (see the sketch below).
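One way to see this, assuming a simple divider and the beta model: the sensitivity dV/dT at a given temperature is greatest when the fixed resistor equals the thermistor's resistance at that temperature, so matching the resistor to the middle of the sub-band concentrates the ADC codes there. All values below are illustrative:

```c
#include <math.h>
#include <stdio.h>

/* Sensitivity of the divider output at temperature t (kelvin) for a
 * given fixed resistor r, under the beta model (values illustrative). */
static double sens_mv_per_c(double r, double t)
{
    const double VS = 3.3, B = 3950.0, R0 = 10000.0, T0 = 298.15;
    double rt   = R0 * exp(B * (1.0 / t - 1.0 / T0));
    double drdt = -rt * B / (t * t);               /* d(Rt)/dT */
    double dvdr = -VS * r / ((r + rt) * (r + rt)); /* d(V)/d(Rt) */
    return dvdr * drdt * 1000.0;                   /* mV per degree C */
}

int main(void)
{
    double t  = 293.15;  /* 20 C, middle of the sub-band of interest */
    double rt = 10000.0 * exp(3950.0 * (1.0 / t - 1.0 / 298.15));

    /* Matching the fixed resistor to Rt(20 C) maximizes dV/dT there. */
    printf("R = Rt(20C): %.2f mV/C\n", sens_mv_per_c(rt, t));
    printf("R = 10k    : %.2f mV/C\n", sens_mv_per_c(10000.0, t));
    printf("R = 47k    : %.2f mV/C\n", sens_mv_per_c(47000.0, t));
    return 0;
}
```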
It seems to me that the most effective sensor is not a complex integrated-circuit chip. The simplest die, consisting of a semiconductor diode and a resistor trimmed to match the process, gives a linear temperature dependence over a fairly wide range (T = -60 ... +150 °C).
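For reference, the forward voltage of a silicon diode falls nearly linearly with temperature, roughly -2 mV/°C, which is the dependence being described. A minimal conversion sketch, with assumed constants that would come from a one- or two-point calibration of the actual part:

```c
/* Diode forward voltage falls roughly linearly with temperature
 * (about -2 mV/C for silicon). Constants below are assumptions for
 * illustration; calibrate them against the actual diode. */
#define VF_AT_25C  0.600    /* forward voltage at 25 C (V), assumed */
#define TEMPCO     -0.002   /* temperature coefficient (V/C), assumed */

double diode_temp_c(double vf)
{
    return 25.0 + (vf - VF_AT_25C) / TEMPCO;
}
```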