The Moment That Started a Firestorm
February 2026. The India AI Impact Summit in New Delhi. Sam Altman, CEO of the company behind ChatGPT, steps up to a panel and drops a statement that would ricochet across every newsroom and social media feed on the planet:
"Water is totally fake. It used to be true. We used to do evaporative cooling in data centres, but now we don't do that."
— Sam Altman, interview with The Indian Express at the India AI Impact Summit, February 2026
He doubled down, calling claims that ChatGPT uses gallons of water per query "completely untrue, totally insane" with "no connection to reality."
Within hours, the clip went viral. Environmental groups fired back. Researchers who had spent years quantifying data center water usage were stunned. Community activists in Oregon, Arizona, and Alabama — people who had watched their local water supplies get redirected into server farms — felt gaslit by one of the most powerful figures in tech.
But here's the thing: Altman wasn't entirely wrong. And he wasn't entirely right. As a data center engineer with 12+ years in critical infrastructure, I've operated cooling towers, specified chiller plants, and watched real-time water meters tick upward. The truth about AI's water footprint is more nuanced than either side admits — and far more consequential than either side wants to confront.
This is a fact-check. Not an opinion piece. Every claim gets weighed against peer-reviewed research, corporate sustainability reports, and on-the-ground data from communities living next to these facilities.
Claim #1: "The Water Thing Is Mostly Fake"
Verdict: Mostly False
U.S. data centers consumed approximately 17 billion gallons of water in 2023. Projections show this reaching 68 billion gallons by 2028 — a 4x increase driven primarily by AI workloads. These are not "fake" numbers; they come from peer-reviewed research published in leading scientific journals.
Let's start with what we actually know.
In 2024, researchers at UC Riverside led by Professor Shaolei Ren published a landmark study in the journal Joule — one of the most prestigious energy research publications in the world. Their peer-reviewed findings established that U.S. data centers consumed approximately 17 billion gallons of fresh water in 2023. To put that in context: that's roughly equivalent to the entire annual water consumption of a city of 300,000 people.
The same research team, along with independent analysis from the International Energy Agency and Xylem/Global Water Intelligence, projects data center water consumption reaching 68 billion gallons annually by 2028. That's not speculation — it's a trajectory based on current construction pipelines, announced AI infrastructure buildouts, and the cooling requirements of GPU-dense computing.
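Both the 4x growth multiple and the city-of-300,000 comparison are easy to reproduce. A quick sketch of the arithmetic (the ~150 gallons/person/day figure, covering household plus municipal overhead, is my assumption, not a number from the study):

```python
# Reproducing the headline numbers: 17B gal (2023) -> 68B gal (2028).
us_2023_bgal = 17   # billion gallons, U.S. data centers, 2023
us_2028_bgal = 68   # billion gallons, projected for 2028

print(f"Growth: {us_2028_bgal / us_2023_bgal:.0f}x over five years")  # 4x

# City equivalence. ASSUMPTION: ~150 gal/person/day incl. municipal overhead.
GAL_PER_PERSON_PER_DAY = 150
people = us_2023_bgal * 1e9 / 365 / GAL_PER_PERSON_PER_DAY
print(f"Equivalent population: ~{people:,.0f}")  # ~310,000 people
```

A different per-capita assumption shifts the population estimate, but not the order of magnitude.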
Why Water? The Engineering Reality
For anyone who hasn't operated inside a data center, a brief explanation of why water matters. Data centers generate enormous amounts of heat. Every watt of electricity consumed by a server eventually becomes heat that must be removed. There are three primary ways to reject that heat:
- Evaporative cooling towers — Water absorbs heat and evaporates. Extremely efficient (PUE as low as 1.1), but consumes large volumes of fresh water. Currently used by approximately 56% of data centers globally.
- Air-cooled chillers — Fans blow ambient air over condenser coils. No water consumed, but less energy-efficient (PUE 1.3-1.5) and limited in hot climates.
- Closed-loop liquid cooling — Coolant circulates in sealed systems (including direct-to-chip and immersion cooling). Minimal water consumption but higher upfront cost. The emerging standard for AI/HPC facilities.
The problem? The vast majority of existing data center capacity — the facilities actually running ChatGPT queries today — was built with option #1: evaporative cooling towers. Water-hungry by design.
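Just how water-hungry follows directly from physics: every kilogram of water that evaporates carries away roughly its latent heat of vaporization. A back-of-envelope sketch (assumed values; real towers also lose water to blowdown and drift, so treat this as a lower bound):

```python
# Evaporation-only water use per MW of heat rejected by a cooling tower.
# ASSUMPTION: latent heat of vaporization ~2.45 MJ/kg near ambient temperature.
LATENT_HEAT_J_PER_KG = 2.45e6
LITERS_PER_GALLON = 3.785

def evaporation_gal_per_day(heat_mw: float) -> float:
    """Gallons per day evaporated to reject `heat_mw` megawatts of heat."""
    kg_per_s = heat_mw * 1e6 / LATENT_HEAT_J_PER_KG  # evaporated mass flow
    liters_per_day = kg_per_s * 86_400               # 1 kg of water ~ 1 liter
    return liters_per_day / LITERS_PER_GALLON

# A 100 MW AI campus running entirely on cooling towers:
print(f"{evaporation_gal_per_day(100):,.0f} gal/day")  # ~930,000 gal/day
```

Nearly a million gallons a day for a single 100 MW campus, before blowdown — which is why the cooling technology choice matters so much.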
The Installed Base Problem
Even if every new data center built from today forward used zero water, the existing installed base of water-cooled facilities will continue operating for 15-25 years. The water problem isn't a legacy issue that's fading away — it's growing, because AI is driving unprecedented demand on facilities that were designed for traditional compute workloads.
Claim #2: "Newer Data Centers Don't Use Much Water"
Verdict: Partially True, But Misleading
Some purpose-built AI facilities do use closed-loop cooling with minimal water. But the majority of new data center capacity coming online still uses evaporative cooling, and even "water-efficient" designs often undercount water used in power generation upstream.
This is where Altman has a legitimate point — and where the nuance matters.
Companies like Meta, Microsoft, and some hyperscalers are indeed building newer facilities with air-cooled or closed-loop designs. Microsoft's facility in Quincy, Washington uses Columbia River water in a closed-loop system. Meta's Prineville, Oregon campus uses an innovative reclaimed-water system. Some newer GPU-dense facilities are deploying direct-to-chip liquid cooling that dramatically reduces water consumption per megawatt.
But there's a critical distinction Altman glosses over: the majority of new data center capacity being built in 2025-2026 still uses evaporative cooling. Why? Because it's cheaper to build, faster to deploy, and proven at scale. When you're racing to bring 500 MW of AI capacity online before your competitor, you reach for the cooling technology you know works — and that's usually cooling towers.
The Upstream Water Blind Spot
There's another dimension most discussions miss entirely: water used in electricity generation. Thermoelectric power plants — coal, natural gas, and nuclear — use massive amounts of water for steam generation and cooling. When a data center draws 100 MW from a gas-fired power plant, the water consumed at the power plant to generate that electricity dwarfs the water used for cooling at the data center itself.
A 2024 study estimated that when upstream power generation water is included, the total water footprint of data centers roughly doubles. So even a "water-free" air-cooled data center is still responsible for significant water consumption through its power demand — unless it runs on solar or wind.
Claim #3: "Old Statistics Are Misleading"
Verdict: True — Some Viral Claims ARE Misleading
The viral claim that "a single ChatGPT query uses 17 gallons of water" is indeed misleading. That figure conflated total facility water usage with per-query attribution in a way that overstated individual query impact. The actual per-query water footprint is much smaller — but still not zero.
Credit where it's due. Altman is right about this one — partially.
In 2023 and 2024, a statistic went viral claiming that a single ChatGPT query consumed "17 gallons of water." That number originated from a misinterpretation of research data. The actual figure, per UC Riverside's peer-reviewed work, is approximately 500ml (about one standard water bottle) per 10-50 ChatGPT responses. For a single query, the water footprint is roughly 10-50ml — significant at scale, but nothing close to 17 gallons.
Altman himself provided a specific figure in his June 2025 blog post "The Gentle Singularity": 0.000085 gallons per query — roughly 0.3ml, or one-fifteenth of a teaspoon. Google later disclosed a similar figure: 0.26ml per median Gemini text prompt. However, these figures measure only direct data center cooling, excluding water used to generate the electricity powering those servers. Academic research that includes upstream water estimates the true figure at approximately 1.2ml per query — nearly 4x the company-provided number.
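The gap between the company figure and the academic estimate is one unit conversion away. A quick check of the numbers quoted above:

```python
# Converting the company-reported figure and comparing to the academic estimate.
ML_PER_GALLON = 3785.41

altman_ml = 0.000085 * ML_PER_GALLON   # OpenAI's figure: direct cooling only
upstream_inclusive_ml = 1.2            # academic estimate incl. power-plant water

print(f"Company figure: {altman_ml:.2f} ml/query")  # 0.32 ml
print(f"Upstream-inclusive estimate is "
      f"{upstream_inclusive_ml / altman_ml:.1f}x larger")  # ~3.7x
```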
The viral misstatement did real damage to the credibility of legitimate water concerns. It gave tech companies an easy target: "See? They're making up numbers." And when you can discredit one statistic, you cast doubt on the entire argument.
But here's what Altman doesn't say: the corrected numbers are still deeply concerning when you multiply them by the scale of AI usage.
The Scale Math That Actually Matters
| Metric | Value | Source |
|---|---|---|
| ChatGPT daily active users | ~200 million | OpenAI (Feb 2026) |
| Average queries per user per day | ~10-15 | Industry estimates |
| Water per query (corrected) | 10-50ml | UC Riverside (peer-reviewed) |
| Daily water for ChatGPT alone | ~5.3M-40M gallons | Calculated |
| Annual water for ChatGPT alone | ~1.9B-14.5B gallons | Calculated |
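The "Calculated" rows are simple multiplication, and the answer swings by two orders of magnitude depending on which per-query figure you trust — which is exactly why published totals differ so widely. A sketch of the arithmetic (user counts from the table above; per-query figures as cited in this article):

```python
# Annual water footprint under the different per-query estimates cited above.
GAL_PER_LITER = 1 / 3.78541

def annual_water_gal(users: int, queries_per_day: float, ml_per_query: float) -> float:
    """Annual gallons for a service, given daily users and per-query water."""
    liters_per_day = users * queries_per_day * ml_per_query / 1000
    return liters_per_day * 365 * GAL_PER_LITER

USERS = 200_000_000
for label, ml in [("company figure", 0.3),
                  ("academic low", 10),
                  ("academic high", 50)]:
    low = annual_water_gal(USERS, 10, ml)    # 10 queries/user/day
    high = annual_water_gal(USERS, 15, ml)   # 15 queries/user/day
    print(f"{label} ({ml} ml/query): {low/1e6:,.0f}-{high/1e6:,.0f}M gal/year")
```

Even under the company's own 0.3 ml figure, ChatGPT alone lands in the tens of millions of gallons per year; the peer-reviewed range puts it in the billions.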
And ChatGPT is just one AI product from one company. Add Claude, Gemini, Copilot, Midjourney, and the hundreds of enterprise AI applications running across millions of GPU-hours daily, and the numbers become staggering — even using the corrected per-query figures that Altman prefers.
The Corporate Reports Altman Doesn't Mention
If AI water usage is "mostly fake," someone forgot to tell Microsoft and Google's own sustainability teams.
Microsoft: 34% Water Increase in One Year
Microsoft's 2024 Environmental Sustainability Report revealed that the company's global water consumption rose 34% from 2021 to 2022 — from 4.7 billion liters to 6.4 billion liters. The report directly attributed this increase to "growth in AI research and cloud computing." For fiscal year 2023, the number climbed further to approximately 7.8 billion liters — an 87% increase from 2020.
Microsoft is OpenAI's largest investor and primary infrastructure partner. The GPUs running ChatGPT sit predominantly in Microsoft Azure data centers. When Microsoft's own sustainability report documents a 34% spike in water consumption driven by AI, calling water concerns "fake" is difficult to reconcile.
Google: 20% Increase, Linked to AI
Google's 2024 Environmental Report showed water consumption reaching approximately 6.1 billion gallons in 2023 — a figure that has more than tripled since 2016. Ninety-five percent of Google's water goes to data centers. Their single largest facility, in Council Bluffs, Iowa, consumed 1 billion gallons alone in 2024. Google explicitly noted that AI workloads contributed to this growth, particularly at facilities in water-stressed regions.
Both companies have pledged to become "water positive" by 2030 — replenishing more water than they consume. But the gap between current consumption trends and those targets is widening, not narrowing.
The Communities Nobody Asked
Statistics are abstract. The people living next to these facilities are not.
The Dalles, Oregon — When Google Drinks 29% of Your Water
The Dalles is a small city of roughly 16,000 people on the Columbia River. Google built its first data center there in 2006, attracted by cheap hydroelectric power and abundant water. By 2022, Google's data center complex was consuming approximately 29% of the city's total water supply.
Residents noticed. Water rates increased. The city had to negotiate complex water rights agreements. When Google applied for expansion permits that would further increase water demand, community opposition was fierce. Google eventually agreed to invest in local water infrastructure — but the fundamental tension remained: a trillion-dollar company was competing with a small-town community for the same finite resource.
Mesa and Chandler, Arizona — Data Centers in a Desert
Arizona is experiencing a historic megadrought. Groundwater levels are declining. The Colorado River — the state's primary water source — is at record lows. Into this environment, data center developers proposed multiple campuses across the Phoenix metropolitan area.
In Mesa, community groups organized against a proposed data center project, citing water scarcity concerns. In Chandler, residents raised alarms about the cumulative impact of multiple facilities all drawing from the same stressed aquifer. In Goodyear, a proposed large-scale campus faced opposition from a community already under mandatory water restrictions.
The irony is sharp: the same AI systems that could help optimize water management are being powered by facilities that strain water supplies in drought-stricken communities.
Bessemer, Alabama — The $14.5 Billion Question
In 2024, a consortium announced plans for a $14.5 billion data center campus near Bessemer, Alabama. The project promised jobs, tax revenue, and economic transformation for a region that needed it. But it also required 2 million gallons of water per day — equivalent to the usage of roughly 6,700 households, or about two-thirds of Bessemer's population — from a municipal system already dealing with aging infrastructure.
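The household equivalence checks out under a common planning assumption of roughly 300 gallons per household per day (my assumption; the article does not state the per-household figure used):

```python
# Sanity check on the Bessemer figure: 2M gal/day vs. household demand.
# ASSUMPTION: ~300 gal/household/day (a common U.S. planning figure).
GAL_PER_HOUSEHOLD_PER_DAY = 300
campus_gal_per_day = 2_000_000

households = campus_gal_per_day / GAL_PER_HOUSEHOLD_PER_DAY
print(f"Equivalent households: ~{households:,.0f}")  # ~6,700
```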
The Warrior River Water Authority said it could not provide the requested volume without "significant upgrades" to the existing water system. A Yale biologist warned the project could risk extinction of the Birmingham darter, a newly identified fish species. The mayor, chief of staff, city attorney, and council members all signed NDAs with the developer — creating a transparency gap that infuriated residents. The city council approved the project anyway, 5-2.
The Pattern
Across every case — Oregon, Arizona, Alabama — the pattern is identical. Data center developers arrive with promises of economic benefit, negotiate favorable water rates, and leave communities to absorb the infrastructure costs and supply constraints. Calling these concerns "fake" is not just inaccurate — it's dismissive of people's lived experience.
What the Researchers Say
The academic community's response to Altman's statement was swift and pointed.
Professor Shaolei Ren, UC Riverside — the researcher whose peer-reviewed work forms the basis of most AI water consumption estimates — has consistently emphasized that while viral exaggerations should be corrected, the underlying trend is real and accelerating. His research team continues to refine per-query water estimates and has documented how AI training runs (not just inference) consume enormous water resources.
The Brookings Institution published an analysis noting that data center water consumption is a legitimate policy concern, particularly in water-stressed regions, and that industry self-reporting consistently understates actual water use by excluding upstream power generation water.
The Ceres "Drained by Data" report (September 2025) found that 32% of data centers nationwide are in high or extremely high water-stress areas, and that nearly two-thirds of new U.S. data centers built since 2022 are located in water-stressed regions. Their analysis projected that Phoenix-region water demand from data center electricity alone will increase by 400% — enough to supply Scottsdale, Arizona (240,000+ people) for over two years.
More than 230 environmental organizations signed an open letter in 2025 calling for mandatory water usage disclosure by data center operators — a practice that remains voluntary in most U.S. jurisdictions. The letter specifically cited the growing gap between industry claims of water efficiency and the reality documented in corporate sustainability reports.
The Full Verdict Table
| Altman's Claim | Verdict | Evidence |
|---|---|---|
| "Water concerns are mostly fake" | MOSTLY FALSE | 17B gallons (2023), projected 68B by 2028. Peer-reviewed in Joule. |
| "Newer data centers don't use much water" | PARTIALLY TRUE | Some new builds use closed-loop, but majority still evaporative. 56% industry-wide. |
| "Old statistics are misleading" | TRUE | The viral "17 gal/query" was wrong. Actual: 10-50ml/query. Still significant at scale. |
| Implied: "AI's water impact is negligible" | FALSE | Microsoft +34%, Google +20% water YoY. Both attribute to AI growth. |
| Implied: "Communities aren't affected" | FALSE | The Dalles (29% city water), Mesa/Chandler (drought), Bessemer ($14.5B conflict). |
What Actually Needs to Happen
The path forward isn't about choosing sides between "AI is destroying water" and "water concerns are fake." Both positions are wrong. Here's what the engineering reality demands:
- Mandatory water disclosure. Every data center above 5 MW should be required to publicly report annual water consumption — including indirect water from power generation. Voluntary reporting has proven inadequate.
- Water-stressed region restrictions. New evaporative-cooled data centers should face stricter permitting requirements in regions classified as water-stressed by the U.S. Drought Monitor. Air-cooled and closed-loop alternatives exist — they're just more expensive.
- Per-query transparency. AI companies should publish verified per-query water footprint data for their models. If the numbers are as small as Altman claims, transparency should be welcome, not resisted.
- Accelerate the cooling transition. The industry is moving toward closed-loop and direct-to-chip cooling, but not fast enough. Financial incentives — tax credits for water-efficient cooling, surcharges on evaporative systems in stressed regions — would accelerate adoption.
- Community consent, not just permits. Data center developers should be required to conduct genuine community engagement — not just check regulatory boxes — before drawing on municipal water supplies. The people who live there deserve a seat at the table, not a press release.
The Engineer's Bottom Line
I've spent over a decade inside data centers. I've watched cooling towers consume water at rates that would make a farmer wince. I've also seen the industry make genuine progress — closed-loop systems, direct-to-chip cooling, heat reuse projects that are technically elegant and genuinely water-efficient.
Both things are true simultaneously. The industry is improving. And the current situation is a real problem.
What's not helpful is a tech CEO standing on a global stage and calling the concerns of researchers, communities, and his own company's sustainability reports "mostly fake." That's not engineering. That's PR.
The water data isn't fake. The communities aren't fake. The 34% year-over-year increase in Microsoft's water consumption isn't fake. And the 68 billion gallons projected by 2028 won't be fake either — unless the industry takes the problem seriously enough to actually solve it.
"In engineering, we don't dismiss data because it's inconvenient. We measure, verify, and act. That's the difference between an engineer and a spokesperson."
— The principle that should guide this conversation
Sam Altman is brilliant at building AI. He's brilliant at narrative. But when it comes to water, the data speaks louder than any talking point. And right now, the data is saying: this is real, it's growing, and dismissing it as "fake" only delays the solutions we actually need.
The sources cited in this article include peer-reviewed research published in Joule, corporate sustainability reports from Microsoft and Google, water utility records from The Dalles, OR, and reporting from The Washington Post, AP News, Reuters, and The Guardian. All statistics are sourced from primary data, not social media claims.
Interactive Water Calculators
Use these tools to explore the actual water footprint of AI systems. All three calculators are built from peer-reviewed data, corporate sustainability reports, and engineering references cited throughout this article.
Three perspectives on AI's water footprint: personal usage, data center operations, and everyday comparisons.
Disclaimer: These calculators are for educational and estimation purposes only. Water consumption varies by specific hardware, workload patterns, ambient conditions, and facility design. Figures are based on peer-reviewed research (Li et al., Joule 2023), corporate sustainability reports (Microsoft 2024, Google 2024), and industry benchmarks (ASHRAE, Uptime Institute).