
Most people understand relative humidity, but the dew point is a much better measure of how humid it really feels.
KNOXVILLE, Tenn. — As we head into the spring and summer, meteorologists are going to talk more about humidity and dew point. But what’s the difference? You might be used to checking the relative humidity, but the dew point is the number that tells you how the air will actually feel.
That is because relative humidity is temperature dependent and the dew point is not. That makes the dew point easier to follow and a more accurate measurement of the moisture in the air.
According to the National Weather Service, the dew point is the temperature to which the air must be cooled, at constant pressure, for the relative humidity (RH) to reach 100%.
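The article does not give a formula, but for readers who want to see how the two quantities relate, here is a minimal sketch of estimating the dew point from temperature and relative humidity, assuming the widely used Magnus approximation; the constants and the helper name dew_point_c are illustrative choices, not the National Weather Service's own method.

```python
import math


def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Approximate the dew point (Celsius) from air temperature (Celsius)
    and relative humidity (percent) using the Magnus approximation.

    The coefficients below are the common Alduchov-Eskridge values; this is
    an illustrative sketch, not an official NWS calculation.
    """
    a, b = 17.625, 243.04  # Magnus coefficients (dimensionless, deg C)
    gamma = math.log(rh_percent / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)


# A 30 C afternoon at 50% relative humidity works out to a dew point near
# 18 C, which most people would describe as noticeably humid.
print(round(dew_point_c(30.0, 50.0), 1))  # ~18.4
```

Note that the same 50% relative humidity on a 15 C day gives a dew point near 5 C, which feels dry, which is why the dew point is the more direct measure of how muggy it feels.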
Relative humidity and dew point are often mistaken for the same thing and can be sources of confusion. But what are the differences between the two, and which is more relevant in everyday life?
When our Storm Track 8 Weather Team talks about muggy conditions, they usually refer to the dew point rather than the relative humidity. The dew point is typically a better measure of the moisture in the atmosphere.
The dew point typically changes much more slowly than relative humidity, unless a front is passing through or precipitation starts falling into a dry air mass.
The dew point is the temperature that we would have to cool the air down to for condensation (or dew) to begin forming.
Every summer, I get asked this question: "Why don't you use relative humidity? I don't care about the dew point." The thing is - you should care about the dew point more.