
The seven-year California drought that ended in mid-2019, paired with the wildfires that followed, are just two recent events that have cast a spotlight on the far-reaching consequences of worsening water shortages. Those worries have been compounded by uncertainty about how climate change will affect water supplies.
The data center sector is under particular scrutiny: in the U.S. alone, data centers are estimated to have consumed 174 billion gallons of water in 2020[1]. For reference, a 15-megawatt data center can use up to 360,000 gallons of water in a single day. Regulatory pressure from state and local officials has been rising as concerns grow over the scarcity of public resources. All of this is happening as cooling options continue to narrow due to restrictions on the use of refrigerants.
Even setting aside climate change and natural disasters such as wildfires, the world is consuming water faster than ever before. The global population has doubled over the past 40 years, but water use has quadrupled, opening a gap between supply and demand. The Water Resources Group forecasts that global water demand may outstrip sustainable supply by 40 percent as soon as 2030.
In addition to overall data center growth, demand for cooling is being driven by new applications and market trends such as machine learning and cryptocurrency mining. Artificial intelligence workloads make extensive use of power-hungry graphics processing units, and AI training runs can require days of sustained heavy processing, which in turn drives intensive cooling demands.
Data center water use has broad impacts on the quality and availability of local water supplies, particularly when groundwater is involved. Such operations invite increased regulatory scrutiny, and operators may have to make expensive infrastructure investments to treat recycled water to the required standards.
The role of water in data center power consumption is an often overlooked aspect of natural resource management. Power generation alone accounts for more than half of water withdrawals in the U.S. Generating a single kilowatt-hour of electricity with coal or nuclear fuel requires about 15 gallons of water. That means the most effective way for some data centers to reduce water usage is to cut back on power consumption. The balance comes from adopting efficient technologies that reduce or even eliminate the need for water. Hyperscale operators such as Microsoft have recently led a charge toward innovative cooling strategies. By pioneering free-air cooling, submerged data centers, higher server inlet air temperatures, and new technologies such as indirect air cooling, these operators are advancing the sector's progress on water use reduction.
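The link between power consumption and water use can be sketched as a back-of-the-envelope calculation. The sketch below is illustrative only: the function name and the split into "direct" and "indirect" water are assumptions, while the 15 gallons per kilowatt-hour and the 15-megawatt facility using 360,000 gallons per day come from the figures cited above.

```python
# Rough estimate of a data center's daily water footprint, splitting
# on-site cooling water from water embedded in electricity generation.
# All constants are illustrative figures taken from the article.

GALLONS_PER_KWH = 15.0  # water to generate 1 kWh from coal or nuclear fuel


def daily_water_footprint(it_load_mw: float,
                          direct_cooling_gal_per_day: float) -> dict:
    """Return direct (cooling) and indirect (generation) daily water use."""
    kwh_per_day = it_load_mw * 1000 * 24          # MW -> kWh over 24 hours
    indirect = kwh_per_day * GALLONS_PER_KWH      # consumed at the power plant
    return {
        "direct_gal": direct_cooling_gal_per_day,
        "indirect_gal": indirect,
        "total_gal": direct_cooling_gal_per_day + indirect,
    }


# Example: the 15 MW facility cited above, using up to 360,000 gallons/day
# for on-site cooling.
est = daily_water_footprint(15, 360_000)
print(est)  # indirect water dwarfs direct cooling water
```

The point of the sketch is that the indirect water embedded in 15 MW of round-the-clock power (5.4 million gallons per day at 15 gal/kWh) far exceeds the on-site cooling figure, which is why cutting power consumption is often the biggest water-saving lever.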
[1] Gillin, P. (2021, January 8). Tackling Data Center Water Usage Challenges Amid Historic Droughts, Wildfires. Data Center Frontier. https://datacenterfrontier.com/data-center-water-usage/
Photo Credit: Shutterstock