(This blog has been written with the support of Mistral.AI.)
When I started my career as a professional software developer, engineer, business consultant, and architect in 1993 (there were more roles, though listing them wouldn’t help explain what I did), I listened carefully to Neil Postman, a brilliant author and cultural critic who captured our times in two words: INFORMATION OVERKILL.
Thirty-two years later, I think I know what he meant, and I partially regret that I went into the IT business. “Drained by data” carries the well-known connotation that too much data exhausts us and psychologically distresses us. We simply get tired (and crazy) when overwhelmed by data.
But this blog is not about the psychological damage the digital world inflicts on us. It is about the physical drain caused by massive data computation and generation.
Of course, I am talking about the massive expansion plans of the big AI players in the US, building data centers wherever they can, as big as they can. This forms a circular investment pattern: AI companies demand data centers; the big cloud providers, which also build their own AI models, supply those data centers; and the data centers require potent chip vendors, who in turn invest in the AI companies.
In 2025 alone, tech giants Google, Meta, Microsoft, and Amazon are on pace to spend as much as $375 billion on data center construction and the supporting AI infrastructure. This spending, described by market analysts as an “AI arms race,” is “meeting and actually exceeding the hype”. This cycle is driven by two main forces: a domestic U.S. policy push to secure AI leadership and an international race to deploy next-generation AI models.1
Currently available investment figures (as of 11/29/2025):
| Company | Investment (2025) |
| --- | --- |
| Amazon | 125 billion USD |
| Microsoft | 80 billion USD |
| Alphabet (Google) | 91–93 billion USD |
| Meta | 66–72 billion USD |
Whether this is a vicious circle cannot yet be said. The resemblance to the dot-com crash around 2000 is striking, but comparisons between past and present (or future) remain speculative.
Anyhow, this blog tries to figure out how feasible this “AI industrialization” is from a technical, financial, and ecological standpoint. Furthermore, it tries to reveal what the goals are behind an “AI industrialization” and the race for AI supremacy.
First, there is a time mismatch: the ambition is to create significant AI factories within the next 18 to 36 months, while the energy generation and high-voltage transmission infrastructure required to power them takes five to ten years to permit and build.
To bridge this gap, AI companies plan to build their own power stations close to the factory, using natural gas or SMRs (small modular reactors). But that, too, takes time and requires regulatory approval. Another idea to soften the impact on the power grid is smart load management: when demand on the grid is low and cheap energy is available (e.g., from renewables), the AI factories run their intensive training jobs, and they throttle back when the grid is stressed. Google already does this, scheduling model training for times of peak green-energy availability.
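A minimal sketch of such a smart-load policy, assuming a hypothetical grid-status feed. The thresholds and field names are my illustrative assumptions, not Google’s actual system:

```python
# Illustrative sketch of price/carbon-aware job scheduling: launch
# deferrable training runs only when grid conditions are favorable.
from dataclasses import dataclass

@dataclass
class GridStatus:
    price_usd_per_mwh: float   # current wholesale electricity price
    renewable_share: float     # fraction of supply from renewables (0..1)

def should_run_training(grid: GridStatus,
                        max_price: float = 40.0,
                        min_renewable_share: float = 0.6) -> bool:
    """Run energy-intensive training only when power is cheap and green."""
    return (grid.price_usd_per_mwh <= max_price
            and grid.renewable_share >= min_renewable_share)

# Example: midday solar surplus -> run; evening demand peak -> defer.
midday = GridStatus(price_usd_per_mwh=22.0, renewable_share=0.75)
evening = GridStatus(price_usd_per_mwh=95.0, renewable_share=0.30)
print(should_run_training(midday))   # True
print(should_run_training(evening))  # False
```

In practice such a policy would be driven by a real-time grid signal (price or carbon intensity) rather than fixed thresholds, but the decision logic is the same.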
But what drives the high demand of resources by AI compute centers?
GPUs excel at neural network training thanks to their parallel processing capabilities, which make them highly efficient at the matrix multiplications and gradient computations at the heart of deep learning. They are specialized for receiving, computing on, and transferring large streams of data, though they are less versatile than general-purpose CPUs. The catch is that they consume far more energy because of the sheer volume of data they process. And electrical power is not the only concern: GPUs run much hotter than CPUs, which requires liquid cooling with highly purified water.
The generative AI workloads that power this boom are vastly more power-intensive than traditional cloud computing. This is a change in kind, not just degree.
- At the Chip Level: The NVIDIA H100 GPU, the current workhorse of AI, has a power consumption of 700 watts (W). A single GPU used at 61% utilization (a conservative estimate) consumes 3.74 megawatt-hours (MWh) per year.
- By Comparison, a Regular CPU: An Intel Core i9-13900K at 50% utilization requires about 0.55 MWh per year. This means the H100 GPU consumes about 6.7 times more energy annually than a high-end CPU under moderate workloads.
- At the Rack Level: A traditional data center rack in the early 2020s might have been designed for a 10-15 kilowatt (kW) load. Today, customers are deploying infrastructure at 100 kW per rack, and future-generation designs are being engineered for 600 kW per rack by 2027.
- At the Facility Level: A typical AI-focused hyperscale data center consumes as much electricity annually as 100,000 households. The next generation of facilities currently under construction is projected to consume 20 times that amount.2
This power density must be understood as a baseline “IT load.” The total power drawn from the grid is even higher: for every 700 W H100 GPU, additional power is required for CPUs, networking switches, and the massive energy overhead of cooling. This overhead is measured by Power Usage Effectiveness (PUE), the ratio of total facility energy to IT equipment energy. A modern facility with a PUE of 1.2, for example, must draw 120 MW from the grid to power a 100 MW IT load.
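The figures above can be verified with a little arithmetic. The 125 W average CPU draw is my assumption, inferred from the stated 0.55 MWh/year; everything else comes from the text:

```python
# Back-of-the-envelope check of the chip- and facility-level figures.
HOURS_PER_YEAR = 24 * 365  # 8760

def annual_energy_mwh(power_watts: float, utilization: float) -> float:
    """Average annual energy in MWh at a given average utilization."""
    return power_watts * utilization * HOURS_PER_YEAR / 1e6

gpu_mwh = annual_energy_mwh(700, 0.61)  # NVIDIA H100, 61% utilization
cpu_mwh = annual_energy_mwh(125, 0.50)  # assumed ~125 W CPU, 50% utilization

print(f"H100: {gpu_mwh:.2f} MWh/yr")      # 3.74 MWh/yr
print(f"CPU:  {cpu_mwh:.2f} MWh/yr")      # 0.55 MWh/yr
print(f"ratio: {gpu_mwh / cpu_mwh:.1f}x") # ~6.8x

# PUE: total grid draw = IT load * PUE
it_load_mw, pue = 100, 1.2
print(f"grid draw: {it_load_mw * pue:.0f} MW")  # 120 MW
```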
Geographic Concentration: Mapping the New Power and Water Hotspots
Data center location is not driven by proximity to population centers. It is a strategic calculation based on three primary factors:
1) the availability and cost of massive-scale power;
2) access to high-capacity fiber optic networks for low latency; and
3) access to large water supplies for cooling.
This logic has led to an extreme geographic clustering of the industry. As of late 2025, approximately one-third of all U.S. data centers are located in just three states: Virginia (663), Texas (409), and California. New key hubs are emerging rapidly in Phoenix, Arizona; Chicago, Illinois; and Columbus, Ohio.
The strain is best understood not at the state level, but at the county level, where this new gigawatt-scale load connects to the grid.3
| County | State | Operating & In Construction (MW) | Planned (MW) | Total Future Load (MW) |
| --- | --- | --- | --- | --- |
| Loudoun | VA | 5,929.7 | 6,349.4 | 12,279.1 |
| Maricopa | AZ | 3,436.1 | 5,966.0 | 9,402.1 |
| Prince William | VA | 2,745.4 | 5,159.0 | 7,904.4 |
| Dallas | TX | 1,294.6 | 2,911.2 | 4,205.8 |
| Cook | IL | 1,478.1 | 2,001.8 | 3,479.9 |
| Morrow/Umatilla | OR | 2,295.5 | 101.0 | 2,396.5 |
| Santa Clara | CA | 1,314.7 | 552.5 | 1,867.2 |
| Franklin | OH | 1,257.4 | 483.0 | 1,740.4 |
| Mecklenburg | VA | 1,019.5 | 502.5 | 1,522.0 |
| Milam | TX | 1,442.0 | 0.0 | 1,442.0 |
This spending spree is part of a projected $3 trillion global investment in data centers by 2030, boosting valuations of chipmakers like Nvidia to record highs. However, the rapid, high-stakes deployment poses challenges for public planning and directly impacts consumers. Projects are often developed in secrecy—using shell companies and vague permit descriptions to avoid scrutiny—so key decisions on power and water infrastructure are made before public announcement, leaving little room for community-wide planning.
This boom is a direct cause of rising consumer bills: A 2025 ICF report projects residential electricity rates will jump 15–40% by 2030—on top of a 34% national increase from 2020 to 2025, the fastest five-year surge in recent history, with data centers as a major driver.4
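Compounding these projections gives a sense of the cumulative burden relative to a 2020 baseline; this is pure arithmetic on the figures cited above:

```python
# A 34% rise from 2020-2025, followed by a further 15-40% by 2030
# (ICF projection), compounds against the 2020 baseline:
low_2030  = 1.34 * 1.15   # lower bound of the 2030 projection
high_2030 = 1.34 * 1.40   # upper bound of the 2030 projection
print(f"2030 vs 2020: +{(low_2030 - 1) * 100:.0f}% to "
      f"+{(high_2030 - 1) * 100:.0f}%")  # +54% to +88%
```

In other words, if both projections hold, residential rates in 2030 would sit roughly 54–88% above their 2020 level.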
The National Water Supply
The power crisis has a twin: a water crisis. The AI industry’s “thirst” is a dual-front problem, encompassing both on-site water use for cooling and a much larger, “hidden” water footprint from power generation. In the arid but high-growth regions of the American West and Southwest, this new demand is creating a dangerous, zero-sum competition for a scarce resource.
Data centers’ water use has two major impacts:
- Direct: Evaporative cooling uses 3–5 million gallons per day (comparable to the water use of a town of 10,000–50,000 people). U.S. direct use tripled between 2014 and 2023.
- Indirect: Power plants (coal, gas, nuclear) consume even more water to generate electricity for data centers.
This creates a trade-off: water-efficient cooling uses more energy, and vice versa—forcing operators in water-scarce areas to choose between stressing the grid or local water supplies.
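A quick sanity check on the population comparison above. The 80–100 gallons per person per day figure is my assumption for typical US residential use, not from the source:

```python
# Convert daily cooling-water draw into a population equivalent,
# assuming 80-100 gallons per person per day of residential use.
for daily_gallons in (3e6, 5e6):
    people_low = daily_gallons / 100   # at 100 gal/person/day
    people_high = daily_gallons / 80   # at 80 gal/person/day
    print(f"{daily_gallons / 1e6:.0f}M gal/day ~ "
          f"{people_low:,.0f}-{people_high:,.0f} people")
```

Under these assumptions, 3–5 million gallons per day corresponds to a town of roughly 30,000–60,000 people, consistent with the upper end of the cited range.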
Case Study in Water Stress: The Compounding Crisis in Phoenix (Maricopa County, AZ)
Maricopa County, Arizona, is a top-three national data center hotspot, with 3.4 GW operating and another 6.0 GW planned. This boom is colliding directly with one of the most severe, long-term water crises in the nation. The region is heavily reliant on the over-allocated Colorado River and has already seen state officials limit new home construction in the Phoenix area due to a lack of provable, long-term groundwater.5
The mitigation strategies?
Let’s focus first on the power supply crisis caused by the AI boom. How do the clever AI people think they can manage the problem?
AI’s 24/7 power demand outstrips intermittent renewables, pushing data centers to secure their own “firm” energy sources.
- Short-term: A natural gas boom—utilities and data centers are building new gas plants. In 2025, Babcock & Wilcox contracted 1 GW of new gas capacity for an AI data center by 2028.
- Long-term: Nuclear co-location and SMRs are now the preferred carbon-free solution. Amazon is powering a Pennsylvania data center directly from Talen Energy’s Susquehanna nuclear plant (960 MW) and partnering with Dominion Energy to deploy SMRs in Virginia, tying a $52 billion expansion to new nuclear build-outs.
Tech giants are becoming their own utilities, bypassing the grid to lock in 30–50 years of reliable, low-cost power—avoiding grid delays, price swings, and transmission bottlenecks.
How realistic is the SMR approach?
SMRs (Small Modular Reactors) are not simply scaled-up submarine reactors from the 1950s, though both share compact nuclear designs. Modern SMRs use low-enriched uranium and are optimized for civilian power, not military bursts. However, their ability to meet AI data centers’ massive, 24/7 energy demands is uncertain and delayed by major challenges:
Current Reality: Setbacks and Skepticism
- Canceled Projects: Many SMR initiatives have been halted or abandoned due to soaring costs and safety concerns. For example, NuScale—once the U.S. leader—canceled its flagship Utah project in 2023 after costs ballooned and utilities withdrew support. Other designs face similar financial and regulatory headwinds.6
- Cost Overruns: SMR electricity is currently 2.5–3 times more expensive than traditional nuclear or renewables, with first-of-a-kind plants costing $7,675–12,500 per kW (vs. roughly $3,000–6,000/kW for large conventional nuclear). While proponents argue costs will drop with mass production, this remains unproven at scale.
- Regulatory Hurdles: Licensing is slow and complex. Even approved designs (like NuScale’s VOYGR) struggle to attract investors or utility contracts, as risks outweigh near-term reward.
- Safety Debates: Public and expert concerns persist over new reactor designs, waste management, and proliferation risks, especially for advanced coolants (e.g., molten salt) or modular scaling.7
Potential for AI Data Centers—But Not Yet
- Theoretical Fit: SMRs could possibly provide carbon-free, always-on power (300–900 MW per plant), ideal for AI’s round-the-clock needs. Some tech giants (Amazon, Google) are betting on SMRs for post-2030 deployments, but these are long-term gambles, not immediate solutions.8
- Competing Stopgaps: Until SMRs mature, natural gas dominates new data center power projects, with nuclear’s role limited to existing plants (e.g., Amazon’s deal with Talen Energy’s Susquehanna plant) or decades-away SMRs.
- Industry Shift: Some companies now prioritize hybrid systems (solar/wind + batteries + grid upgrades) or even large conventional nuclear plants (e.g., Microsoft’s Three Mile Island revival) to avoid SMR uncertainties.9
Outlook: A Risky Bet
SMRs remain high-risk, high-reward. While they could become a backbone of AI infrastructure, their current track record of canceled projects, cost overruns, and regulatory delays suggests they won’t solve the near-term energy crisis for data centers. For now, gas and grid expansions are the default, with SMRs possibly emerging as a niche solution after 2030, if costs and safety issues are resolved.
How to fix the water problem?
The move to direct-to-chip liquid cooling (DLC) and immersion cooling is non-negotiable; it is the only way to cool the next generation of AI hardware. A massive positive side effect is that these “waterless,” closed-loop systems solve the direct water consumption problem.
Microsoft has already launched its new “zero water for cooling” data center design as of August 2024. It uses a closed-loop, chip-level liquid cooling system. Once filled at construction, the same water is continually recycled. This design saves over 125 million liters of water per year, per data center. The closed-loop system being built for OpenAI’s Michigan “Stargate” facility is similarly designed to avoid using Great Lakes water.
This technological shift is critical. It directly addresses the primary source of community opposition in water-scarce regions. However, it is not a silver bullet. By enabling more powerful and denser racks, these technologies increase the total electricity demand of the facility. In doing so, they solve the direct water footprint but may inadvertently worsen the indirect water footprint from power generation.10
Final statement
In the current situation, hardly anyone asks why we need this massive investment in AI. The big players will say, “Because ‘we’ need AGI,” without telling us what AGI is. AGI, like the term intelligence itself, is vaguely defined. So will the big players at some point announce, “Now we have AGI, be happy”? Even within the AI research community there is growing doubt about whether AGI is feasible with the current approach at all (https://techpolicy.press/most-researchers-do-not-believe-agi-is-imminent-why-do-policymakers-act-otherwise). There is a simple rule of thumb in AI: if you increase the complexity of your model, you must also increase the complexity of your training data. And that could be the bottleneck. The big players have already scraped every kind of data there is; the limit is not the quantity of data but its quality.
Meta is even thinking about superintelligence. Geoffrey Hinton, who did not sound too optimistic in his Nobel Prize acceptance speech, gave a pragmatic piece of advice for anyone contemplating superintelligence: talk it over with chickens first. They know what life is like under the control of a superior intelligence.
We could do many good things with current AI without this massive planned increase in AI compute, such as improving agricultural yields without such an immense use of chemicals.11
Launching such an arms race at the clear dawn of a fatal climate crisis is, from my point of view, a clear sign of a suicidal species.
Citations:
- https://www.nmrk.com/insights/market-report/2025-us-data-center-market-outlook
- pewresearch.org/short-reads/2025/10/24/what-we-know-about-energy-use-at-us-data-centers-amid-the-ai-boom
- https://www.visualcapitalist.com/map-network-powering-us-data-centers/
- https://www.reddit.com/r/technology/comments/1ny2o3n/ai_data_centers_are_skyrocketing_regular_peoples/
- https://watercenter.sas.upenn.edu/splash/water-stress-water-scarcity
- https://en.wikipedia.org/wiki/Small_modular_reactor
- https://www.sciencedirect.com/science/article/pii/S1738573325005686
- https://introl.com/blog/smr-nuclear-power-ai-data-centers-2025
- https://www.commonfund.org/cf-private-equity/data-center-and-ai-power-demand-will-nuclear-be-the-answer
- https://www.multistate.us/insider/2025/10/2/data-centers-confront-local-opposition-across-america
- https://happyeconews.com/japans-ai-reforestation-drones/