Among giant technology companies, Amazon has historically been conspicuously quiet about its strategy for water consumption. That changed in late November, when the Seattle company’s Amazon Web Services operation — its biggest profit engine — declared a plan to become “water positive” by 2030.
There are four components to its strategy: using sensors and real-time data to analyze water consumption and to quickly identify issues, such as leaks; opting for more sources that reduce the draw on potable water, including recycled water and harvested rainwater; embracing schemes that send “spent” water back to communities for usages such as irrigation; and investing in water replenishment projects through partners such as Water.org in places such as Brazil, India, Indonesia and South Africa.
The commitment mirrors those of Amazon's cloud services rivals Microsoft and Google, and many of the strategies its team is using to improve on-site water efficiency in its data centers, such as investments in water recycling systems, are also being promoted by the other two companies.
But Amazon Web Services is unique for how it's planning to report on progress: Instead of opting for volumetric disclosures, it is emphasizing its water usage effectiveness (WUE), a decade-old metric from data center standards organization The Green Grid that measures the ratio of water used on site (for cooling the IT equipment and for on-site energy generation) against the amount of electricity delivered to the IT equipment. (WUE is a sibling of power usage effectiveness, or PUE, which measures the ratio of a facility's total electricity consumption, including cooling and other overhead, against the electricity used by the IT equipment alone.) According to various academic studies, the average WUE across the data center industry is about 1.8 liters of water per kilowatt-hour. In 2021, Amazon reported a global water efficiency result of 0.25 liters/kWh.
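To make the two ratios concrete, here is a minimal sketch of how WUE and PUE are calculated. The figures plugged in below are hypothetical, chosen only so the results match the 0.25 and 1.8 liters/kWh values cited above; they are not Amazon's actual operational data.

```python
# Illustrative only: hypothetical figures, not real operational data.

def wue(site_water_liters: float, it_energy_kwh: float) -> float:
    """Water usage effectiveness: liters of water consumed on site
    per kilowatt-hour of energy delivered to the IT equipment."""
    return site_water_liters / it_energy_kwh

def pue(total_facility_kwh: float, it_energy_kwh: float) -> float:
    """Power usage effectiveness: total facility energy (servers plus
    cooling and other overhead) divided by IT equipment energy alone.
    A value of 1.0 would mean zero overhead."""
    return total_facility_kwh / it_energy_kwh

# Suppose a facility's IT load consumes 10 GWh in a year:
it_kwh = 10_000_000
print(wue(2_500_000, it_kwh))   # 0.25 L/kWh, Amazon's reported 2021 figure
print(wue(18_000_000, it_kwh))  # 1.8 L/kWh, the cited industry average
```

Because both metrics are normalized to IT energy, they let facilities of very different sizes be compared on efficiency rather than raw consumption, which is the argument Hewes makes below.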
Unlike Microsoft and Google, Amazon doesn't disclose the overall on-site water consumption for the Amazon Web Services data centers, nor does it plan to do so, according to Will Hewes, water sustainability lead for Amazon Web Services, who oversees both the efficiency work and the selection of replenishment projects.
“We really think WUE is the best metric to measure. By normalizing it to the size of your business, it gives you a sense of how effectively you’re using water within your operations. Otherwise you could be a small organization that says, ‘We’re not using very much water,’ but relative to the size of your business, you actually may be really inefficient.”
Amazon Web Services plans to update its WUE performance on an ongoing basis (although not for individual data centers), and Hewes said it will strive for ongoing reductions.
So far, Microsoft and Google are choosing to focus on disclosing the overall volume of water withdrawn by their companies. For perspective, Google withdrew 6.3 billion gallons during the 2021 fiscal year, according to its latest environment report. Back in 2017, it was withdrawing about 3 billion gallons. Microsoft expresses its overall water usage differently: it reported total consumption of 4.5 million cubic meters. As already noted, Amazon isn’t publishing that number, although the web page for the company’s water progress reports that the company will be replenishing 2.4 billion liters of water annually once all of its current replenishment projects are complete. One of these projects is in Oregon, where Amazon Web Services sends 96 percent of the “spent cooling water” from its data center into a farming irrigation system.
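The three companies report in three different units (US gallons, cubic meters and liters), which makes the disclosures above hard to eyeball side by side. The short sketch below converts each figure to billions of liters; note that "withdrawn," "consumed" and "replenished" are distinct measures, so this is a rough sense of scale, not an apples-to-apples ranking.

```python
# Convert the disclosed volumes to a common unit (billions of liters).
# Caveat: withdrawal, consumption and replenishment are different metrics.
US_GALLON_L = 3.785  # liters per US gallon

google_l = 6.3e9 * US_GALLON_L       # 6.3 billion gallons withdrawn (FY2021)
microsoft_l = 4.5e6 * 1000           # 4.5 million cubic meters consumed
amazon_replenish_l = 2.4e9           # planned annual replenishment

for name, liters in [("Google (withdrawn)", google_l),
                     ("Microsoft (consumed)", microsoft_l),
                     ("Amazon (replenished)", amazon_replenish_l)]:
    print(f"{name}: {liters / 1e9:.1f} billion liters")
```

On that rough basis, Google's withdrawal works out to roughly 24 billion liters, an order of magnitude larger than either of the other two figures.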
Earlier this year, Microsoft began reporting the WUE for its Azure cloud services: Its global result is 0.49 liters/kWh, slightly higher than its design goal. Google doesn't appear to publish a WUE figure in its annual environment report (at least not in the latest one), so a direct comparison isn't possible. A side note: A European data center initiative, the Climate Neutral Data Centre Pact, calls for data centers operating at full capacity in water-stressed areas, and that use potable water, to be designed for a maximum WUE of 0.4 liters/kWh by Jan. 1, 2025.
“There’s probably a bigger range in WUE than there is in PUE, because there are fundamentally really different cooling technologies” that data center operators can select, which affect that ratio, Hewes said. “At 0.25, that’s beyond good. That’s the best that I’ve seen in the industry. What does good look like? If you’re a company that’s using a lot of cooling towers, you’re more in that 1 to 2 liter/kWh. The vast majority of our builds are the really efficient direct evaporative design. And those are often in the 0.4 liter/kWh range.”
Hewes said a growing number of Amazon Web Services customers have begun asking about its water consumption. “We expect that it’s gonna be motivating to customers and especially the increasing number of customers that we see asking about water sustainability, in addition to carbon and renewables where we also have goals.”
It’s all about keeping your cool
How much water a particular data center sips or guzzles is dependent on the equipment or facility design approach, which can vary dramatically based on location. In cooler climates such as Ireland and Sweden, for example, Amazon Web Services data centers don’t use water for cooling for almost 95 percent of the year, instead using outside air to keep the servers and such from overheating. In Brazil, two data centers are using rainwater harvesting to supply a portion of the water used in the cooling equipment, cutting down on the amount of potable water that the company withdraws from the local watershed.
Evaporative cooling is Amazon Web Services’ “preferred cooling strategy” for data centers — kind of like a “swamp cooler” used for homes. The approach works by pulling in hot air from outside the facility and pushing it through water-soaked cooling pads. When the water evaporates, the air cools and is sent into the server halls.
When Amazon Web Services’ Santa Clara, California location switched to direct evaporative cooling, it reduced the facility’s annual water use by 85 percent. The site also uses recycled and reclaimed water instead of potable water.
According to Amazon’s water stewardship page, it uses recycled water to supply 20 data centers today. Hewes hopes to increase that number. “We would use recycled water everywhere, if it were available,” he said. (Amazon Web Services doesn’t publicly disclose how many data centers it operates globally.)
Arranging for a data center to use recycled water isn’t a trivial undertaking — it’s part of a multi-year planning process. And as with many decisions, there are tradeoffs. For each new facility or expansion, Amazon Web Services runs assessments related to water supply that run parallel to those for electricity. In some cases, those evaluations rule out the use of water as a cooling strategy. In places where a municipality has already developed a water recycling program, Amazon Web Services will work with agencies to invest in a separate distribution system — a process embedded into the company’s overall expansion plans.
What happens if a wastewater recycling strategy doesn’t exist or is less mature? In some water-scarce places, Amazon has stepped in with other businesses and community organizations to encourage development of an entire system, by demonstrating to a municipality that companies are willing to co-invest and become offtakers of recycled and reclaimed water for their operations.
“[Water recycling initiatives] really do create a long-term, more reliable and more resilient water supply,” Hewes said. “When there’s a drought, people still make wastewater. To the extent that we can help support building out those types of systems, especially in water-scarce areas, we think it’s the right decision for the business as well as for the community.”