Difference Between Dew Point and Humidity

Main Difference – Dew Point vs. Humidity

Humidity and dew point are both quantities used to express the amount of water vapour present in the air. The main difference between dew point and humidity is that humidity measures the amount of water vapour in the atmosphere, whereas the dew point is the temperature at which dew can begin to form.

What is Humidity

Humidity is a measurement of the water vapour content of the air. It can be expressed as absolute humidity or as relative humidity; of these, relative humidity is the one more commonly quoted in weather reports.

Absolute Humidity

Absolute humidity is defined as the mass of water vapour per unit volume of air, where the volume is that of the moist air sample including the water vapour it contains.

Relative Humidity 

Relative humidity refers to how much water vapour is present in the air, compared to how much water vapour the air could hold if it were saturated. It can be defined as the ratio of the partial pressure of water vapour in a sample of air at a given temperature to the saturation vapour pressure of water at the same temperature. This ratio is most often expressed as a percentage. For instance, fog forms when the relative humidity is close to 100%.
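
As a rough illustration of this definition, the short Python sketch below computes relative humidity from the two vapour pressures; the numbers are example values chosen for this sketch, not figures taken from the article.

```python
# A minimal sketch, assuming pressures are given in hPa. It applies the
# definition above: relative humidity is the ratio of the actual partial
# pressure of water vapour to the saturation vapour pressure at the same
# temperature, expressed as a percentage.

def relative_humidity(vapour_pressure_hpa, saturation_pressure_hpa):
    """Relative humidity (%) from actual and saturation vapour pressures."""
    return 100.0 * vapour_pressure_hpa / saturation_pressure_hpa

# Example values only: at 20 °C the saturation vapour pressure of water is
# roughly 23.4 hPa. An actual vapour pressure of 14.0 hPa then corresponds
# to a relative humidity of about 60%.
print(relative_humidity(14.0, 23.4))  # ≈ 59.8
```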

Fog forms when the relative humidity is close to 100%.

Humidity is measured using hygrometers. Traditionally, hair tension hygrometers were used; these consist of a piece of human or animal hair connected to a dial. The length of the hair changes with humidity, and this change in length is mechanically amplified to display the humidity on a scale. Today, it is more common to use electronic sensors to measure humidity.

A hair hygrometer, which was used to measure humidity.

When the relative humidity of the air is high, sweat evaporates from the skin more slowly, which can feel uncomfortable. On the other hand, if the relative humidity is too low, the air can feel quite “dry”.

What is Dew Point

The dew point is the temperature to which a given sample of air, at constant atmospheric pressure, must be cooled for its water vapour to reach saturation (100% relative humidity). At or below this temperature, the water vapour can condense to form dew, hence the name. Dew or frost is often seen in the mornings because it forms overnight, when temperatures drop below the dew point. In clouds, the temperature must fall to the dew point for rain to form. The frost point is a related term, describing the temperature at which frost begins to form; this happens when water vapour reaches saturation at a temperature below the freezing point.
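
The article does not give a formula for the dew point, but as a hedged illustration, the sketch below estimates it from air temperature and relative humidity using the widely quoted Magnus approximation; the coefficients shown are one standard choice, and the result is only approximate.

```python
from math import log

# A hedged sketch, not from the article: the Magnus approximation for the
# dew point. A and B are one commonly used pair of coefficients; the result
# is an estimate, not an exact thermodynamic value.
A, B = 17.62, 243.12  # coefficients for temperature in °C

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point (°C) from air temperature (°C) and RH (%)."""
    gamma = log(rel_humidity_pct / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

# Example values only: air at 25 °C and 60% relative humidity has a dew
# point of roughly 16.7 °C.
print(round(dew_point_c(25.0, 60.0), 1))
```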

Dew forms when the temperature falls below the dew point.

Dew point can be measured using a dew point hygrometer, which uses a mirror whose surface temperature is gradually lowered; the temperature at which dew begins to form on the mirror is recorded as the dew point. For a given air temperature, the lower the dew point, the lower the relative humidity.
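
To illustrate the relationship stated above, the following sketch (again using the Magnus approximation, an assumption rather than something given in the article) computes the relative humidity implied by a measured dew point at a fixed air temperature; a lower dew point yields a lower relative humidity.

```python
from math import exp

# A sketch assuming the same Magnus approximation as above (an assumption,
# not something stated in the article). For a fixed air temperature, a
# lower dew point gives a lower relative humidity.
A, B = 17.62, 243.12  # coefficients for temperature in °C

def saturation_vapour_pressure_hpa(temp_c):
    """Approximate saturation vapour pressure of water (hPa)."""
    return 6.112 * exp(A * temp_c / (B + temp_c))

def rh_from_dew_point(temp_c, dew_point_c):
    """Relative humidity (%) from air temperature and dew point (both °C)."""
    return 100.0 * saturation_vapour_pressure_hpa(dew_point_c) / saturation_vapour_pressure_hpa(temp_c)

# Example values only, at an air temperature of 25 °C:
print(round(rh_from_dew_point(25.0, 20.0)))  # dew point 20 °C -> about 74% RH
print(round(rh_from_dew_point(25.0, 10.0)))  # dew point 10 °C -> about 39% RH
```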

What is the difference between Dew Point and Humidity

What it Represents

Humidity is a measurement of the amount of water vapour in the air.

Dew point expresses the temperature at which a relative humidity of 100% would be reached and dew would begin to form.

Values

Humidity is measured either in units of mass per volume for absolute humidity (e.g. kg m⁻³) or as a percentage for relative humidity.

Dew point is measured in units of temperature (e.g. °C).

 

Image Courtesy:
(Untitled) by PRO8 og (Own work) [CC BY 2.0], via Flickr
“Haar-Hygrometer, Made in GDR” by Daniel FR (German Wikipedia, original upload 20 October 2004 by Daniel FR) [Public Domain], via Wikimedia Commons
“An artistic expression of form and color graced by water drops fashioned from dew.” by William Waterway (Own work) [CC BY-SA 3.0], via Wikimedia Commons

About the Author: Nipun