AI water
(Credit: Alberto Bobbera/Unsplash)

A Look At the True Resource Cost of AI Data Centers

As questions mount over what the mass-scale adoption of artificial intelligence will mean for human lives, the economy, and the workforce, concerns about its environmental footprint have grown in parallel. Among the most frequently repeated claims is the idea that something as small as a single sentence typed into an AI system can consume surprisingly large quantities of water, mainly due to the cooling demands of the ever-expanding data centers that power today’s most common AI models.

While it is true that the infrastructure supporting these systems can consume large amounts of water, a closer examination of the data helps dispel several common misconceptions. When resource consumption is reported accurately, the evidence suggests a reality shaped primarily by geography, infrastructure design, and scale—rather than by individual user interactions.

Keeping it Cool

At the core of the debate are massive data centers housing thousands of high-performance processors used both to train and operate large AI models. The feeling that you could fry an egg on the back of your laptop after a long workday gives some sense of how formidable a challenge heat mitigation becomes at this scale.

Many of these facilities rely on evaporative cooling systems, which dissipate heat using water, while others employ air-based cooling, closed-loop liquid systems, or more recent technological approaches such as immersion cooling. Each method has its own trade-offs in terms of efficiency, cost, and water use.

The realities of these cooling requirements have understandably raised concerns about long-term sustainability as AI workloads increase and data centers continue to expand. However, attributing a large share of a facility’s water consumption to each individual interaction dramatically oversimplifies how that consumption actually plays out.

Milliliters, Not Gallons

Researchers who have attempted to estimate water usage at the level of individual AI queries have consistently reached conclusions that differ significantly from those suggested by viral claims. Most analyses distribute a data center’s total water consumption across the immense number of requests processed each day—often numbering in the millions or billions.

Using this approach, credible estimates indicate that the water associated with a single AI query typically falls within the milliliter range, far below the figures suggested by viral claims. Additional studies suggest values ranging from a few milliliters to a few dozen milliliters per query, depending on assumptions about cooling efficiency, regional infrastructure, and electricity generation.
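The arithmetic behind these per-query estimates is simple division. As a rough sketch, using invented round numbers rather than figures from any real facility, suppose a data center consumes one million liters of cooling water per day while handling 200 million queries:

```python
# Back-of-envelope estimate of direct cooling water per AI query.
# Both figures below are illustrative assumptions, not measured values.

DAILY_FACILITY_WATER_L = 1_000_000   # hypothetical: 1 million liters of cooling water per day
DAILY_QUERIES = 200_000_000          # hypothetical: 200 million queries served per day

# Convert liters to milliliters, then spread across all queries.
water_per_query_ml = DAILY_FACILITY_WATER_L * 1000 / DAILY_QUERIES
print(f"{water_per_query_ml:.1f} mL per query")  # prints "5.0 mL per query"
```

Even when the inputs are varied by an order of magnitude in either direction, the result stays in the range of fractions of a milliliter to a few dozen milliliters, which is why published per-query estimates cluster there.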

Researchers at the University of California, Riverside have estimated that a 100-word AI prompt can consume the equivalent of a 519-milliliter bottle of water, but that figure reportedly accounts for indirect consumption, such as the water used to generate the electricity a query draws, in addition to direct cooling, and it varies considerably by region. Claims that place routine per-query consumption far higher appear to stem from misunderstandings of how facility-level water use scales across individual computational tasks.

Why the Numbers Matter

One source of confusion surrounding AI’s water footprint lies in how water usage is calculated. Some estimates account only for direct water use, such as cooling within data centers, while others also include indirect consumption, such as water used by power plants to generate electricity.
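The gap between direct-only and direct-plus-indirect accounting can be made concrete with a small sketch. The assumed numbers below are illustrative only: a Water Usage Effectiveness (WUE) of 1.8 liters of on-site cooling water per kilowatt-hour, and 3.0 liters of water consumed off site per kilowatt-hour of electricity generated:

```python
# Illustrative comparison of direct vs. direct-plus-indirect water accounting.
# All three inputs are assumptions chosen for the example, not measured values.

WUE_L_PER_KWH = 1.8          # assumed on-site cooling water per kWh of IT energy
GRID_WATER_L_PER_KWH = 3.0   # assumed off-site water per kWh of electricity generated

energy_kwh = 1000            # hypothetical daily IT energy use of a small facility

direct_l = WUE_L_PER_KWH * energy_kwh                           # cooling water only
total_l = (WUE_L_PER_KWH + GRID_WATER_L_PER_KWH) * energy_kwh   # cooling + generation

print(f"direct: {direct_l:.0f} L, direct + indirect: {total_l:.0f} L")
```

With these assumptions, the indirect share more than doubles the total, which is one reason two studies of the same facility can report very different water footprints.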

Geography is another major factor. Data centers located in hot or arid regions typically require more water for cooling than those in cooler climates. At the same time, many newer facilities are intentionally designed to minimize freshwater use by relying on recycled water or more advanced cooling technologies.

Training large AI models further complicates the picture. Training runs are extremely energy- and resource-intensive, but they occur far less frequently than everyday inference tasks such as generating text or answering questions.

It’s About Scale, Not Sentences 

Although the water impact of individual AI queries remains relatively small, the long-term effects of large-scale AI adoption are not. Billions of daily interactions, combined with the rapid expansion of data center capacity, are driving substantial regional water demand. Focusing on water use per sentence, however, risks missing the larger issue. The environmental footprint of AI is primarily a matter of scale, efficiency, and infrastructure planning—not individual user behavior.

It is understandable that the scale of water usage required to cool modern AI facilities can be difficult to grasp, and this uncertainty has helped fuel widespread misconceptions. In an era of rapidly spreading misinformation, oversimplified claims can obscure the real challenges associated with AI’s environmental impact.

A closer look at resource management across data centers suggests that while AI’s water consumption is not as extreme as commonly portrayed, it remains significant and warrants scrutiny. As demand continues to grow, responsible infrastructure design and intelligent resource management will play a crucial role in ensuring that these systems remain sustainable and that the quality of life of the communities that share resources with these facilities remains at the forefront of the discussion.

Rather than focusing on sensational claims, a more productive conversation centers on transparency, efficiency, and how emerging technologies can reduce the environmental footprint of AI at scale.

Caleb Hanks is a freelance writer, musician, and audio engineer based in Asheville, North Carolina.