Hi everyone,
I'm trying to calculate relative humidity from the 2-metre dew point temperature and the 2-metre air temperature in ERA5-Land hourly data. From the ERA5-Land overview, I understand that the 2 m dew point temperature is a measure of the humidity of the air and, combined with temperature and pressure, can be used to calculate relative humidity; it is obtained by interpolating between the lowest model level and the Earth's surface, taking account of the atmospheric conditions.
Here’s what I understand so far:
- Both the dew point temperature and the air temperature are provided in Kelvin, so I convert them to °C by subtracting 273.15.
- I need to calculate the actual vapour pressure (e_d) and the saturation vapour pressure (e_s), both in hPa, using the Magnus/Tetens approximation with T_d and T in °C:
- e_d = 6.11 * 10 ^ (7.5 * T_d / (237.3 + T_d))
- e_s = 6.11 * 10 ^ (7.5 * T / (237.3 + T))
- Then, the relative humidity (RH) can be calculated as (see the Python sketch after this list):
- RH = 100 * (e_d / e_s)
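Here's a minimal Python/NumPy sketch of what I have in mind, assuming the inputs are the 2 m temperature and dew point in Kelvin (as ERA5-Land provides them); the function and variable names are just my own for illustration:

```python
import numpy as np

def relative_humidity(t2m_k, d2m_k):
    """Relative humidity (%) from 2 m air temperature and dew point, both in Kelvin.

    Uses the Magnus/Tetens approximation with the 6.11 hPa / 7.5 / 237.3
    coefficients from the formulas above.
    """
    t_c = t2m_k - 273.15    # air temperature in deg C
    td_c = d2m_k - 273.15   # dew point temperature in deg C

    # Actual and saturation vapour pressure (hPa)
    e_d = 6.11 * 10.0 ** (7.5 * td_c / (237.3 + td_c))
    e_s = 6.11 * 10.0 ** (7.5 * t_c / (237.3 + t_c))

    return 100.0 * e_d / e_s

# Example: t2m = 293.15 K (20 degC), d2m = 283.15 K (10 degC) -> RH of roughly 52-53 %
print(relative_humidity(np.array([293.15]), np.array([283.15])))
```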
Could someone confirm if this approach is correct, or suggest any adjustments? Additionally, are there any considerations or common pitfalls I should be aware of when using these calculations?
Thanks in advance for your help!