When deriving Land Surface Temperature (LST) from Landsat Collection 2 products, it is necessary to apply a scaling factor to the thermal band to convert the digital numbers (DN) to radiance values. The radiance values can then be used to derive LST with various algorithms.
To apply the scaling factor to a Landsat Collection 2 product, you can follow these steps:
Open the Landsat Collection 2 product in a software application that supports remote sensing analysis, such as ENVI or ArcGIS.
Identify the thermal band in the product. For Landsat 8 and 9 Collection 2 products, the thermal bands are Band 10 and Band 11 (TIRS); earlier Landsat missions carry a single thermal band, Band 6.
Check the metadata of the Landsat Collection 2 product to obtain the rescaling values for the thermal band. The multiplicative factor is listed under the metadata item "RADIANCE_MULT_BAND_x" and the additive offset under "RADIANCE_ADD_BAND_x", where "x" is the number of the thermal band.
Multiply the DN values in the thermal band by the multiplicative factor and add the offset to obtain radiance values. This can be done using the raster calculator tool in ENVI or ArcGIS.
Once you have the radiance values, you can use them to derive LST using various algorithms, such as the split-window algorithm or the single-channel algorithm.
It's important to note that the rescaling values are specific to each Landsat Collection 2 product and thermal band, so it's necessary to obtain the correct values for each product and band. Applying the scaling at the correct step in the process is crucial for accurate LST derivation from Landsat Collection 2 products.
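The DN-to-radiance step above can be sketched in Python with NumPy. The gain and offset below are illustrative placeholders; in practice you would read the real RADIANCE_MULT_BAND_10 / RADIANCE_ADD_BAND_10 values from the scene's MTL metadata file:

```python
import numpy as np

# Placeholder rescaling values for illustration only; read the real ones
# from the product's MTL file (RADIANCE_MULT_BAND_10 / RADIANCE_ADD_BAND_10).
RADIANCE_MULT_BAND_10 = 3.3420e-04
RADIANCE_ADD_BAND_10 = 0.10000

def dn_to_radiance(dn, mult, add):
    """Convert thermal-band digital numbers to at-sensor spectral radiance."""
    dn = np.asarray(dn, dtype=np.float64)
    return mult * dn + add

# Small example array standing in for a thermal-band raster
dn = np.array([[21500, 22000],
               [23000, 24000]])
radiance = dn_to_radiance(dn, RADIANCE_MULT_BAND_10, RADIANCE_ADD_BAND_10)
```

In a real workflow the `dn` array would come from reading the Band 10 GeoTIFF (e.g. with rasterio or GDAL); the arithmetic is the same.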
To derive Land Surface Temperature (LST) from Landsat Collection 2 data, you need to apply a scaling factor to convert the thermal band data (in digital numbers) first to radiance and ultimately to LST values (in Kelvin). The scaling factor is specific to each Landsat sensor and band, and is provided in the Landsat metadata (MTL) file.
The scaling factor is applied during the conversion of the raw digital numbers (DN) in the thermal band to radiance and then to LST. The scaling factor is used to convert the DN values to at-sensor radiance values. Then, the radiance values are corrected for atmospheric effects and converted to LST values.
The scaled radiance then enters the temperature formula:
LST = K2 / (ln(K1/Lλ + 1)) - 273.15
Strictly speaking, this formula yields the at-sensor brightness temperature in degrees Celsius; atmospheric and emissivity corrections are still needed to obtain true LST.
Where:
K1 and K2 are sensor-specific constants
Lλ is at-sensor radiance
ln is the natural logarithm
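As a sketch, the formula can be applied directly in Python. The K1/K2 values below are the published Landsat 8 TIRS Band 10 constants (metadata fields K1_CONSTANT_BAND_10 and K2_CONSTANT_BAND_10); for other sensors or bands, read them from your scene's metadata:

```python
import numpy as np

# Landsat 8 TIRS Band 10 thermal constants, as published in the
# Level-1 metadata (K1_CONSTANT_BAND_10 / K2_CONSTANT_BAND_10).
K1 = 774.8853   # W / (m^2 * sr * um)
K2 = 1321.0789  # Kelvin

def radiance_to_temp_celsius(radiance):
    """Convert at-sensor radiance to brightness temperature in Celsius."""
    radiance = np.asarray(radiance, dtype=np.float64)
    return K2 / np.log(K1 / radiance + 1.0) - 273.15

# Example: a radiance value of about 7.29 W/(m^2 sr um)
temp_c = radiance_to_temp_celsius(7.2853)
```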
To apply the scaling factor, you need to first convert the DN values to at-sensor radiance using the formula:
Lλ = (DN * scale_factor) + offset
Where:
DN is the digital number in the thermal band
scale_factor is the band-specific multiplicative rescaling factor (RADIANCE_MULT_BAND_x in the metadata)
offset is the band-specific additive rescaling factor (RADIANCE_ADD_BAND_x in the metadata)
After applying the scaling factor, you can continue with atmospheric correction and conversion to LST using the formula above.
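Putting the two steps together, here is a minimal end-to-end sketch, assuming a Landsat 8 Level-1 Band 10 scene. The gain and offset are placeholders to be read from the MTL file, and fill pixels (DN = 0) are masked out before the conversion:

```python
import numpy as np

# DN -> radiance -> brightness temperature for a Level-1 thermal band.
# GAIN/OFFSET are illustrative placeholders (RADIANCE_MULT/ADD_BAND_10);
# K1/K2 are the Landsat 8 TIRS Band 10 constants from the metadata.
GAIN, OFFSET = 3.3420e-04, 0.10000
K1, K2 = 774.8853, 1321.0789

dn = np.array([0, 21500, 24000], dtype=np.float64)  # 0 = fill / nodata
valid = dn > 0                                      # mask nodata pixels
radiance = np.where(valid, GAIN * dn + OFFSET, np.nan)
temp_c = K2 / np.log(K1 / radiance + 1.0) - 273.15  # brightness temp, Celsius
```

Note that this yields at-sensor brightness temperature; the atmospheric correction step described above still sits between this result and true LST.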
When I use this formula, LST = K2 / (ln(K1/Lλ + 1)) - 273.15, after getting Lλ, I get LST values of more than 800, which is weird. I used Landsat 8, Collection 2, Level 2 data. However, if I use the given Surface Temperature scaling, "Band 10 * 0.00341802 + 149.0", I get values between -16 and 57. Since I used Level 2, does that mean applying the scaling gives me the LST directly, with no need for further algorithms?