A Bottle of Water Per Email: The Hidden Environmental Costs of Using AI Chatbots
By Pranshu Verma and Shelly Tan, The Washington Post

Chatbots use an immense amount of power to respond to user questions, and simply keeping the bots’ servers cool enough to function in data centers takes a toll on the environment. While the exact burden is nearly impossible to quantify, The Washington Post worked with researchers at the University of California, Riverside to understand how much water and power OpenAI’s ChatGPT, using the GPT-4 language model released in March 2023, consumes to write the average 100-word email.
Let’s look first at water.
Each prompt on ChatGPT flows through a server that runs thousands of calculations to determine the best words to use in a response.
In completing those calculations, these servers, typically housed in data centers, generate heat. Often, water systems are used to cool the equipment and keep it functioning. Water transports the heat generated in the data centers into cooling towers to help it escape the building, similar to how the human body uses sweat to keep cool, according to Shaolei Ren, an associate professor at UC Riverside.
Where electricity is cheaper, or water comparatively scarce, electricity is often used to cool these warehouses with large units resembling air-conditioners, he said. That means the amount of water and electricity an individual query requires can depend on a data center’s location and vary widely.
Even in ideal conditions, data centers are often among the heaviest users of water in the towns where they are located, environmental advocates said. But data centers with electrical cooling systems are also raising concerns by driving up residents’ power bills and taxing the electric grid.
Data centers also require massive amounts of energy to support other activities, such as cloud computing, and artificial intelligence has only increased that load, Ren said. If a data center is located in a hot region — and relies on air conditioning for cooling — it takes a lot of electricity to keep the servers at a low temperature. If data centers relying on water cooling are located in drought-prone areas, they risk depleting the area of a precious natural resource.
In Northern Virginia, home to the world’s highest concentration of data centers, citizen groups have protested construction of these buildings, saying they are not only loud energy hogs that don’t bring in enough long-term jobs, but also eyesores that kill home values. In West Des Moines, Iowa, an emerging hotbed of data centers, water department records showed that facilities run by companies like Microsoft used around 6 percent of all the district’s water. After a lengthy court battle, The Oregonian newspaper forced Google to disclose how much water its data centers were using in The Dalles, about 80 miles east of Portland; the records revealed it was nearly a quarter of all the water available in the town.
Before chatbots can even fulfill a request, a huge amount of energy is spent training them. The large language models that allow chatbots like ChatGPT to generate lifelike responses all require servers to analyze millions of pieces of data. (The Post has created a tool, Climate Answers, that asks ChatGPT to summarize Post climate coverage. Because it searches a limited universe of information, its queries are designed to be less resource-intensive.)
It can take months to train these advanced computer models, according to AI experts and research papers, and tech companies such as Google, Meta and Microsoft are scrambling to build data centers.
Each one continuously churns out heat.
Big tech companies have made numerous pledges to make their data centers greener by using new cooling methods. Those climate pledges are often not met.
In July, Google released its most recent environmental report, showing its carbon emissions rose by 48 percent, largely due to AI and data centers. It also replenished only 18 percent of the water it consumed — a far cry from its goal of replenishing 120 percent by 2030. “Google has a long-standing commitment to sustainability, guided by our ambitious goals — which includes achieving net-zero emissions by 2030,” said Mara Harris, a spokesperson for Google.
“AI can be energy-intensive and that’s why we are constantly working to improve efficiency,” said Kayla Wood, an OpenAI spokesperson. Ashley Settle, a Meta spokesperson, said in a statement that the company prioritizes “operating our data centers sustainably and efficiently, while ensuring that people can depend on our services.” Craig Cincotta, a general manager at Microsoft, said the company “remains committed to reducing the intensity with which we withdraw resources,” and added that Microsoft is working towards installing data center cooling methods that “will eliminate water consumption completely.”
Tech companies such as Nvidia will keep creating computer chips that draw more power per server to perform more computations, experts said. AI is creating unprecedented demands on data centers that far outpace any in recent history, Ren said.
“The rapid rise of AI has dramatically changed this trajectory,” Ren said, “and presented new challenges the industry has never met before.”
Methodology
Water and electricity costs were calculated by Ren for ChatGPT-4 at an average American data center. A full methodology can be found in his paper “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models.”
D.C. electricity consumption figures were calculated from the U.S. Energy Information Administration’s 2023 state energy reports. Rhode Island daily water use comes from the National Environmental Education Foundation. Beef and rice per capita consumption numbers come from U.S. Department of Agriculture food availability data and the OECD’s agricultural outlook. Water footprint estimates were sourced from Livestock Science’s “Assessing water resource use in livestock production: A review of methods” and Science’s “Reducing food’s environmental impacts through producers and consumers.”