Full Research Article – Regular Issue
Quantifying evaporative loss from reservoirs plays a critical role in sound water-availability planning and in reservoir management. Various methods are used to quantify reservoir evaporation; however, each method carries a degree of uncertainty that propagates to model predictions of available water within a reservoir or a reservoir network. Herein, we explore the impact of uncertainty in reservoir evaporation on model outputs of historical and future water availability across the five major reservoirs in the Savannah River Basin in South Carolina, USA, using four different evaporation methods. Variability in the total available water is evaluated using the United States Army Corps of Engineers (USACE) 2006 Drought Contingency Plan hydrologic model of the Savannah River Basin, which incorporates recent water-management plans and reservoir controls. Results indicate that, during droughts, reservoir evaporation plays a large role in water-availability predictions, and uncertainty in evaporative losses produces significant uncertainty in modeled water availability for extreme events. For example, the return period for an event in which the availability of water in Lake Hartwell was reduced to 50% of full pool capacity varied from 38.2 years to 53.4 years, depending on the choice of evaporation parameterization, a variation of approximately 40% in the return period.
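The ~40% figure quoted above follows directly from the two reported return periods. A minimal sketch of that arithmetic (using the 38.2-year value as the baseline, which is an assumption about how the authors computed the percentage):

```python
# Relative spread in return period across evaporation parameterizations,
# using the values reported in the abstract (38.2 and 53.4 years).
low_return_period = 38.2   # years, shortest return period among methods
high_return_period = 53.4  # years, longest return period among methods

# Spread expressed as a percentage of the lower (baseline) value.
spread_pct = (high_return_period - low_return_period) / low_return_period * 100
print(f"{spread_pct:.0f}%")  # prints "40%"
```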
Phillips, R. C.; Kaye, Nigel; and Saylor, John. "A Multi-Reservoir Study of the Impact of Uncertainty in Pool Evaporation Estimates on Water-Availability Models," Journal of South Carolina Water Resources: Vol. 7, Article 5. Available at: https://tigerprints.clemson.edu/jscwr/vol7/iss1/5