Researchers report US data centers may need 697–1,451 MGD of new water capacity by 2030
US data centers may need 697–1,451 million gallons per day (MGD) of new peak water capacity by 2030 in a baseline scenario, even though their share of national water use stays small. The practical takeaway: model local peak-day water constraints, not just annual averages, when planning new clusters.

TL;DR
- A new paper argues US data centers could require 697–1,451 million gallons per day of new peak water capacity by 2030 in its baseline scenario, with the authors comparing that range to New York City’s daily supply of roughly 1,000 MGD.
- The paper’s main warning is not national water share but local peak-day stress: according to the paper’s summary, data centers still use a “very small percentage” of total US water, yet evaporative cooling can create sharp demand spikes that older municipal systems cannot absorb.
- The abstract shown in [img:0|paper abstract] says data centers evaporate roughly 75% of the water they draw from local public supplies, which helps explain why hot-weather cooling demand matters more than annual averages.
- A competing take in a blog thread argues the broader “AI water crisis” is overstated because direct onsite water use remains tiny nationally, but even that thread concedes that individual hosting communities can face real local constraints.
What did the researchers actually model?
The researchers model three scenarios through 2030 using 2024 operator reports and public utility data, and the baseline case produces the headline number: 697–1,451 MGD of new peak capacity. The abstract in [img:0|paper abstract] adds an optimistic case where industry-wide water-use intensity falls 10% per year, cutting the required new capacity to 227–604 MGD.
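A quick sketch of how the 10%-per-year intensity decline compounds over 2024–2030. This is a flat-scaling illustration only, not the paper's model: the six-year horizon and the application of the factor directly to the baseline range are assumptions, and the paper's own optimistic range (227–604 MGD) differs because its scenarios involve more than this single factor.

```python
# Illustrative compounding of a 10%/year water-use-intensity decline.
# Assumption: six annual steps from 2024 to 2030; this flat scaling is
# NOT the paper's method, just a feel for how fast the factor shrinks.
baseline_peak_mgd = (697, 1451)   # paper's baseline new-capacity range

years = 2030 - 2024
intensity_factor = 0.9 ** years   # fraction of 2024 intensity left by 2030

scaled = tuple(round(v * intensity_factor) for v in baseline_peak_mgd)
print(f"2030 intensity vs 2024: {intensity_factor:.2f}")
print(f"flat-scaled baseline: {scaled[0]}-{scaled[1]} MGD")
```

The flat-scaled range (roughly 370–771 MGD) sits above the paper's optimistic 227–604 MGD, a reminder that the authors' scenario modeling is richer than a single multiplier.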
That framing matters for infrastructure planning because the paper is about capacity, not just annual consumption. As the thread notes, annual data-center water use in the baseline still reaches 60–110 billion gallons, but the operational bottleneck is whether local systems can supply cooling loads on the hottest days.
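The gap between annual consumption and peak capacity can be checked with simple arithmetic: converting the paper's baseline annual range (60–110 billion gallons) to an average daily rate and comparing it against the 697–1,451 MGD peak-capacity range. The numbers are from the article; the comparison itself is a back-of-envelope sketch.

```python
# Convert the baseline annual range to an average MGD and compare with
# the peak-capacity range, to show why peaks are the binding constraint.
annual_low_bg, annual_high_bg = 60, 110            # billion gallons/year

avg_low = annual_low_bg * 1e9 / 365 / 1e6          # -> MGD average
avg_high = annual_high_bg * 1e9 / 365 / 1e6

peak_low, peak_high = 697, 1451                    # MGD of new peak capacity
print(f"average demand: {avg_low:.0f}-{avg_high:.0f} MGD")
print(f"peak vs average: ~{peak_low / avg_high:.1f}x to {peak_high / avg_low:.1f}x")
```

The average works out to roughly 164–301 MGD, so the required peak capacity is several times the average daily draw, which is exactly the mismatch annual reporting hides.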
The paper also proposes a policy mechanism: the summary says operators should fund expansion of local water capacity before new server clusters connect. For engineers working on site selection, that turns water from a generic sustainability metric into a hard dependency alongside power interconnects and transmission queues.
Why are local deployments the real constraint?
The national numbers can look small while still breaking specific utility systems. The thread calls out Northern Virginia as the clearest hotspot, with similar pressure in Georgia, Indiana, Wisconsin, Iowa, and Oregon; single facilities can draw 1–8 MGD and take a large share of local public supply.
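To make the local share concrete, here is a minimal sketch using the thread's 1–8 MGD facility-draw range. The 20 MGD utility capacity is a hypothetical figure for a small municipal system, not a number from the paper or the thread.

```python
# Share of a small local system's capacity taken by one data center.
# The 1-8 MGD facility draw is from the thread; the 20 MGD utility
# capacity is a HYPOTHETICAL small-system figure for illustration.
facility_draw_mgd = (1, 8)
utility_capacity_mgd = 20  # assumed small municipal system

for draw in facility_draw_mgd:
    share = draw / utility_capacity_mgd
    print(f"{draw} MGD facility -> {share:.0%} of a {utility_capacity_mgd} MGD system")
```

On those assumptions a single large facility can claim anywhere from a few percent to well over a third of a small system's capacity, which is why the hotspot framing is per-utility rather than per-state.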
That local-versus-national split is also the core rebuttal in the opposing blog. The author, summarized in the blog thread, argues 2023 direct onsite data-center use was about 50 MGD, or 0.04% of total US water use, and says AI itself was only a fraction of that. But the same thread acknowledges that “individual data centers can impact local water systems,” which lines up with the paper’s claim that planning errors show up first in host communities, not in national totals.
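A quick consistency check on the blog's national figure: if 50 MGD really is 0.04% of total US water use, the implied denominator can be recovered by division. The arithmetic below is a sketch; which national total (withdrawals vs. consumptive use) the blog author used is not stated in the thread, and that choice changes how the percentage should be read.

```python
# Back out the national total implied by "50 MGD is 0.04% of US water use".
direct_use_mgd = 50
national_share = 0.0004        # 0.04% expressed as a fraction

implied_total_mgd = direct_use_mgd / national_share
print(f"implied US total: {implied_total_mgd:,.0f} MGD")
```

The implied total is 125,000 MGD; whether that denominator counts all withdrawals or only consumptive use determines how small "tiny nationally" actually is.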
For operators, the engineering takeaway is simple: annual ESG reporting will miss the binding constraint if the deployment depends on evaporative cooling during summer peaks. Per the paper's summary, that is exactly where “aging local public water systems” face the largest stress.