The Hidden Thirst of AI: The Water Risk Behind AI Data Centers

In the rush to build the next generation of artificial intelligence infrastructure, one resource has quietly slipped into the background: water. While much attention has focused on energy consumption and computing scale, data centers increasingly depend on large volumes of freshwater to keep their servers cool, especially as they power AI workloads that run continuously. With many of these facilities located in drought-prone regions, water stress is emerging as a critical constraint on technology infrastructure rather than a peripheral concern.

The scale of the problem starts with compute. Modern generative AI systems demand enormous computing power, and every additional rack of processors generates heat that must be dissipated. Engineers note that cooling systems often use evaporative or adiabatic water loops, and one estimate puts the time-averaged water use at roughly one 500 ml bottle of water per 20 to 50 AI search queries. Even this modest-sounding figure becomes immense when multiplied across billions of users. At the facility level the numbers are starker still: in the United States, a single gigawatt-scale data center can draw up to five million gallons of water every day for cooling, roughly equivalent to a small town's usage. These are not isolated cases; they reflect a structural shift in how digital infrastructure and water systems interact.
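
To make the per-query figure concrete, here is a minimal back-of-envelope sketch in Python. The 500 ml per 20 to 50 queries estimate comes from the paragraph above; the daily query volume is an illustrative assumption, not measured traffic.

```python
# Back-of-envelope: water cost of AI queries.
# Source figure: ~500 ml evaporated per 20-50 queries (time-averaged estimate).
BOTTLE_ML = 500
QUERIES_PER_BOTTLE_LOW, QUERIES_PER_BOTTLE_HIGH = 20, 50

ml_per_query_high = BOTTLE_ML / QUERIES_PER_BOTTLE_LOW    # 25.0 ml per query
ml_per_query_low = BOTTLE_ML / QUERIES_PER_BOTTLE_HIGH    # 10.0 ml per query

# Illustrative assumption: one billion queries per day across all users.
DAILY_QUERIES = 1_000_000_000
liters_low = DAILY_QUERIES * ml_per_query_low / 1000
liters_high = DAILY_QUERIES * ml_per_query_high / 1000

print(f"{liters_low:,.0f} to {liters_high:,.0f} liters per day")
# -> 10,000,000 to 25,000,000 liters per day
```

At one billion queries a day, the modest per-query figure already implies 10 to 25 million liters of water daily.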

Berkeley Lab research estimates that U.S. data centers directly consumed roughly 17 billion gallons of water in 2023 for on-site cooling alone. Indirect water use, through the electricity generation that supplies those centers, adds hundreds of billions of gallons more. Globally, the International Energy Agency has estimated current data center water use at about 560 billion liters per year, with projections reaching as high as 1,200 billion liters by 2030 if AI growth continues at its present pace. To put those figures in perspective, a one-megawatt facility can use up to 25.5 million liters of water annually, about as much as 300,000 people in a medium-sized city consume in a single day. As AI clusters expand, water demand is rising quickly in both absolute terms and as a share of local supply.
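
As a quick sanity check on that comparison, assume a per-capita municipal use of about 85 liters per day (an illustrative figure; actual usage varies widely by city):

```python
# Sanity check: 25.5 million liters/year vs. one day's use by 300,000 people.
FACILITY_LITERS_PER_YEAR = 25_500_000   # 1 MW facility, upper-end estimate
PER_CAPITA_LITERS_PER_DAY = 85          # illustrative assumption
CITY_POPULATION = 300_000

city_liters_per_day = CITY_POPULATION * PER_CAPITA_LITERS_PER_DAY
print(city_liters_per_day == FACILITY_LITERS_PER_YEAR)   # True
# The facility's *annual* draw equals roughly one *day* of the city's use.
```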

Cooling systems sit at the heart of this challenge. Hyperscale data centers dissipate huge heat loads, historically managed through cooling towers and water circulation loops. Evaporative cooling is energy-efficient because evaporating water carries heat away, but it comes at the cost of substantial freshwater input and often discharges warm, mineral-concentrated water back into local systems. Alternatives such as air-cooled systems and liquid-immersion cooling can reduce water use, but they often require higher energy consumption or greater capital investment. Regional climate plays a major role as well: a facility sited in Phoenix, Arizona, will use far more water per megawatt of load than one in northern Europe, simply because of higher ambient temperatures and lower humidity. Some operators reduce their water impact by sourcing reclaimed water or by siting in cooler climates where air cooling can substitute for evaporation much of the year. Facilities in wetter regions, such as Virginia or Ireland, can draw on outside air or treated water from local utilities rather than tapping freshwater sources directly.
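
The water intensity of evaporative cooling follows directly from the latent heat of vaporization of water. A minimal physics sketch, assuming (optimistically) that all heat leaves through evaporation and ignoring the drift and blowdown losses real towers also incur:

```python
# How much water does 1 MW of heat evaporate?
LATENT_HEAT_J_PER_KG = 2_260_000   # latent heat of vaporization (~2.26 MJ/kg)
HEAT_LOAD_W = 1_000_000            # 1 MW of IT heat to reject

kg_per_second = HEAT_LOAD_W / LATENT_HEAT_J_PER_KG   # ~0.44 kg/s
liters_per_day = kg_per_second * 86_400              # 1 kg of water ~ 1 liter

print(f"{liters_per_day:,.0f} liters/day per MW")    # ~38,230 liters/day
```

At roughly 38,000 liters per day per megawatt, a large campus lands in the millions of liters per day, consistent in order of magnitude with the facility figures above.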

Yet the water footprint extends well beyond the cooling towers themselves. The electricity powering these data centers often originates from thermal power plants (coal, natural gas, and nuclear), which consume substantial water for their own cooling systems. A thermal plant needs water to condense steam back into liquid after it has driven the turbines, and this process can be as water-intensive as the data center cooling itself. Estimates suggest that indirect water consumption through electricity generation may rival or even exceed direct cooling water use; for a data center powered partly by thermal generation, the total water footprint could easily double once upstream power plant operations are counted alongside on-site cooling. In regions where data centers draw from grids dominated by coal or natural gas, this hidden water cost becomes especially significant. Renewable sources such as wind and solar, by contrast, require minimal operational water, making them not only carbon-friendly but also water-efficient complements to data center expansion in water-stressed regions.
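
The "could easily double" claim can be sketched with a simple grid water-intensity model. All numbers below are illustrative placeholders, roughly in line with published liters-per-kilowatt-hour ranges for thermal generation, not measured values for any real facility:

```python
# Direct + indirect water footprint of a data center's electricity use.
ANNUAL_ENERGY_KWH = 100_000_000      # illustrative: ~11.4 MW average load
DIRECT_WATER_L_PER_KWH = 1.8         # on-site cooling water per kWh (assumed)

# Assumed water consumed per kWh generated, by source (liters/kWh):
GRID_INTENSITY = {"coal": 1.9, "natural_gas": 1.0, "nuclear": 2.3,
                  "wind": 0.0, "solar_pv": 0.1}
# Assumed share of each source in the local grid mix:
GRID_MIX = {"coal": 0.20, "natural_gas": 0.40, "nuclear": 0.20,
            "wind": 0.12, "solar_pv": 0.08}

indirect_l_per_kwh = sum(GRID_INTENSITY[s] * share
                         for s, share in GRID_MIX.items())
direct = ANNUAL_ENERGY_KWH * DIRECT_WATER_L_PER_KWH
indirect = ANNUAL_ENERGY_KWH * indirect_l_per_kwh
print(f"direct: {direct:,.0f} L, indirect: {indirect:,.0f} L")
# With this mix, indirect use (~125M L) is the same order as direct (~180M L),
# so counting both roughly doubles the footprint.
```

Note how the wind and solar shares pull the indirect figure down; shifting the mix further toward renewables is what makes them water-efficient as well as carbon-friendly.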

Geography and local context shape how water and data centers intersect. Google's facility in Council Bluffs, Iowa, reportedly used over 1.3 billion gallons of potable water in a recent year, equivalent to the annual use of more than 28,000 homes. In Mesa, Arizona, a Meta campus has raised concerns among local officials because it sits in a region already struggling with water scarcity. Similar situations have emerged abroad: in Chile and the Netherlands, Microsoft faced local pushback over its data centers' water use and the strain they placed on nearby aquifers. These cases show how large technology companies' siting decisions often land in communities that already face water shortages, creating new tensions around fairness, governance, and local resource management.

Transparency and accountability remain inconsistent. Many companies now publish detailed energy and carbon data for their operations, but water reporting lags behind. Metrics such as Water Usage Effectiveness (WUE), which measures liters of water consumed per kilowatt-hour of IT energy, are slowly gaining adoption. Yet in most jurisdictions there is still no standard requirement to disclose the volume of potable water used for cooling. Some companies have announced ambitious water-positive goals: Google plans to replenish 120 percent of the freshwater it consumes by 2030, Meta has pledged to return more water than it uses, and Microsoft has made a similar commitment. These efforts are encouraging, but they often rely on offset projects that do not always align with local water conditions or community priorities. The absence of unified regulations, combined with complex water rights laws and limited local oversight, makes it difficult to verify how these commitments translate into practice.
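
For readers unfamiliar with the metric, WUE is a straightforward ratio; here is a minimal sketch with made-up facility numbers:

```python
def water_usage_effectiveness(annual_water_liters: float,
                              annual_it_energy_kwh: float) -> float:
    """WUE: liters of water consumed per kWh of IT equipment energy."""
    return annual_water_liters / annual_it_energy_kwh

# Hypothetical facility: 400 million liters/year against 220 million kWh of IT load.
wue = water_usage_effectiveness(400_000_000, 220_000_000)
print(f"WUE = {wue:.2f} L/kWh")   # ~1.82 L/kWh
```

The simplicity is the point: the metric is easy to compute and compare, so the barrier to reporting it is policy, not measurement.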

New technologies and practices offer potential paths forward. Direct-to-chip liquid cooling circulates coolant through cold plates mounted directly on processors, sharply lowering both energy and water requirements compared with traditional tower-based systems. The use of reclaimed and non-potable water is increasing; one large operator reported in 2023 that about a quarter of its cooling water came from non-potable sources. Wastewater reuse, on-site desalination, and local treatment systems are being piloted in regions where freshwater access is limited. AI-driven workload scheduling can also shift computational tasks to facilities in cooler or wetter areas during periods of high heat, easing strain on both grids and water systems. For these advances to stick, technology and policy must evolve together, including faster permitting for water-efficient facilities and stronger disclosure standards that make water data as visible as energy data.
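
The scheduling idea can be illustrated with a toy dispatcher that routes a batch job to the site with the lowest expected water cost. The site names, WUE values, and temperature cutoff are all hypothetical:

```python
# Toy water-aware scheduler: route work to the site with the lowest
# expected water cost (job energy x site WUE), skipping overheated sites.
SITES = {
    # name: (wue_l_per_kwh, current_ambient_c) -- hypothetical values
    "desert_campus": (2.4, 41),
    "coastal_campus": (1.1, 24),
    "nordic_campus": (0.4, 12),
}
HEAT_CUTOFF_C = 35   # avoid sites in extreme heat, when towers work hardest

def pick_site(job_energy_kwh: float) -> str:
    water_cost = {name: wue * job_energy_kwh
                  for name, (wue, temp) in SITES.items()
                  if temp < HEAT_CUTOFF_C}
    return min(water_cost, key=water_cost.get)

print(pick_site(5_000))   # -> 'nordic_campus' (lowest expected water draw)
```

A production scheduler would weigh latency, carbon intensity, and grid prices alongside water, but the core mechanic, ranking sites by expected resource cost, is the same.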

The broader implications extend beyond the tech industry. As heat waves intensify, populations grow, and droughts become more severe, the relationship between digital growth and water availability will increasingly shape resilience for cities and regions. Data centers do not just draw electricity; they draw water, competing with agriculture, households, and ecosystems for a finite resource. The tension between digital progress and environmental limits is not merely theoretical. It is already unfolding, one cooling tower at a time. For those working in or adjacent to this industry, the essential question is how to scale AI and cloud infrastructure while preserving the very systems that make it possible. Can the digital future grow without deepening the planet’s thirst?
