Relative humidity (RH) is defined in the WRT body of knowledge as the amount of moisture contained in an air sample compared to the maximum amount that the same air sample could contain at that temperature (i.e., at saturation). The WRT manual expresses RH as a percentage on the psychrometric chart: the proportion of moisture actually present versus what the air could hold if saturated at that same temperature.
This definition is essential because RH is temperature-dependent: as air temperature changes, RH changes even if the actual moisture content (humidity ratio) stays the same. The WRT reference emphasizes that air can hold more water vapor as temperature increases; therefore, increasing temperature decreases RH (with no added moisture), while decreasing temperature increases RH.
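This temperature dependence can be shown numerically. The sketch below uses the Magnus approximation for saturation vapor pressure, a standard psychrometric formula that is an illustration on my part, not something taken from the WRT manual: heating a fixed parcel of air lowers its RH because the denominator (moisture capacity) grows while the actual moisture content stays constant.

```python
import math

def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Saturation vapor pressure in hPa (Magnus approximation;
    constants are the common 6.112 / 17.62 / 243.12 set)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def relative_humidity(vapor_pressure_hpa: float, temp_c: float) -> float:
    """RH (%) = actual vapor pressure / saturation vapor pressure * 100."""
    return 100.0 * vapor_pressure_hpa / saturation_vapor_pressure_hpa(temp_c)

# Air at 20 C and 60% RH holds a fixed amount of water vapor...
e_actual = 0.60 * saturation_vapor_pressure_hpa(20.0)

# ...heating that same air to 30 C lowers its RH, even though the
# moisture content (and thus the vapor pressure) is unchanged.
rh_warm = relative_humidity(e_actual, 30.0)
```

Cooling the same parcel runs the effect in reverse: RH climbs toward 100% (the dew point) with no moisture added.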
In restoration practice, RH is used as a practical indicator of the drying environment and a predictor of moisture behavior in hygroscopic materials. The WRT manual notes that hygroscopic materials have an equilibrium moisture content primarily determined by RH: when RH is low, materials generally lose moisture; when RH is high—especially above about 60%—materials tend to gain significant moisture, increasing the likelihood of secondary damage.
Although restorers frequently track humidity ratio (GPP) and vapor pressure to quantify drying force, RH remains a core operational measurement because it is directly readable from a thermo-hygrometer and aligns with material response risk thresholds. Consequently, RH is the correct term for the described comparison-to-maximum-at-temperature concept, and it is one of the foundational psychrometric variables used in WRT to manage drying conditions and prevent secondary damage.
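The link between RH and GPP mentioned above can be sketched with standard psychrometric relations. This is an illustrative calculation, not the WRT manual's method: the Magnus formula, a sea-level barometric pressure of 1013.25 hPa, and the 0.622 molecular-weight ratio are all textbook assumptions.

```python
import math

SEA_LEVEL_PRESSURE_HPA = 1013.25   # assumed; adjust for altitude
GRAINS_PER_POUND = 7000.0          # 7000 grains of water = 1 pound

def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Magnus approximation for saturation vapor pressure (hPa)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def grains_per_pound(rh_percent: float, temp_c: float) -> float:
    """Humidity ratio in grains of moisture per pound of dry air."""
    e = (rh_percent / 100.0) * saturation_vapor_pressure_hpa(temp_c)
    # Humidity ratio in kg water per kg dry air (0.622 = Mw/Md ratio):
    w = 0.622 * e / (SEA_LEVEL_PRESSURE_HPA - e)
    return w * GRAINS_PER_POUND

# The same 60% RH reading represents very different moisture loads
# at different temperatures -- which is why restorers track GPP too:
gpp_cool = grains_per_pound(60.0, 20.0)  # roughly 61 GPP
gpp_warm = grains_per_pound(60.0, 30.0)  # substantially higher
```

This is why a thermo-hygrometer reading of RH alone does not fully describe the drying environment: RH flags material-response risk thresholds, while GPP and vapor pressure quantify the actual moisture load and drying force.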