AI#028 · December 23, 2025 · 5 min read

The AI Buildout Has an Energy Problem

The conversation about AI infrastructure focuses on chips, data centers, and capital expenditure. The constraint that will actually determine the pace of the buildout is simpler and harder to solve: electricity. The US power grid is not built for what the AI industry wants to do with it.


The scale of power demand

A large AI training run uses roughly as much electricity as a small city for several months. Inference (running models at scale for users) is a constant, growing load. Microsoft, Google, Meta, and Amazon collectively plan to spend over $300 billion on data center infrastructure in the next three years. Most of that capital is going into facilities that will consume staggering amounts of power.
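The "small city for several months" comparison holds up to a rough sanity check. The sketch below uses purely illustrative assumptions (the GPU count, per-GPU draw, run length, and household figures are placeholders, not numbers from any disclosed training run):

```python
# Back-of-envelope check: frontier training run vs. a small city, over 90 days.
# Every input here is an illustrative assumption, not a reported figure.
N_GPUS = 20_000        # assumed accelerators in a large training run
KW_PER_GPU = 1.0       # assumed draw per GPU incl. cooling/overhead (kW)
DAYS = 90              # assumed run length

# Total training energy over the run, in MWh.
run_mwh = N_GPUS * KW_PER_GPU * 24 * DAYS / 1000

# Assumed small city: 30,000 households at ~30 kWh/day each.
city_mwh = 30_000 * 30 * DAYS / 1000

print(f"Training run: {run_mwh:,.0f} MWh over {DAYS} days")
print(f"Small city:   {city_mwh:,.0f} MWh over {DAYS} days")
```

Under these assumptions the two land within the same order of magnitude, which is all the claim requires.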

The US Department of Energy estimates that data center power consumption will triple by 2030 compared to 2023 levels. That represents a new load roughly equivalent to adding the entire state of Texas to the grid. The grid, which has been slowly decarbonizing, will have to absorb this demand while managing the intermittency of renewable energy sources.
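The "adding Texas" framing can be checked with simple arithmetic. The baseline and Texas figures below are illustrative assumptions chosen for the sketch, not numbers sourced from the DOE estimate itself:

```python
# Back-of-envelope check of the "adding the state of Texas" comparison.
# Inputs are illustrative assumptions, not figures from the DOE estimate.
DC_2023_TWH = 176       # assumed US data center consumption, 2023 (TWh/yr)
GROWTH_MULTIPLE = 3     # "triple by 2030"
TEXAS_TWH = 430         # assumed annual Texas electricity consumption (TWh/yr)

# New load is the growth above the 2023 baseline.
new_load_twh = DC_2023_TWH * (GROWTH_MULTIPLE - 1)

print(f"Added load: {new_load_twh} TWh/yr")
print(f"Share of Texas's annual consumption: {new_load_twh / TEXAS_TWH:.0%}")
```

With these assumed inputs the added load comes out at roughly four fifths of Texas's annual consumption, close enough to justify the comparison.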

The grid is not ready

US electricity transmission infrastructure is aging and insufficient. Interconnection queues (the waiting list for new power projects to connect to the grid) now stretch to 10 years in many regions. New data centers are being announced faster than the transmission capacity to serve them can be built.

This is creating a bifurcation in where AI infrastructure can be built. Regions with surplus power (parts of the Midwest, the Southeast, certain Western states) are seeing enormous data center investment. Coastal tech hubs that are constrained on power and land are losing the race for physical AI infrastructure even as they retain the talent. The geography of AI is being determined as much by kilowatts as by code.

Nuclear as the answer nobody expected

The most interesting response to the power constraint has been a serious, unprecedented corporate push into nuclear energy. Microsoft signed a deal to restart the Three Mile Island nuclear plant. Google contracted for power from multiple small modular reactor projects. Amazon has made nuclear commitments across its data center portfolio.

Nuclear's appeal is straightforward: it provides 24/7 carbon-free baseload power at predictable prices, with a physical footprint far smaller than equivalent solar or wind capacity. Small modular reactors, if they reach commercial deployment at the projected costs, could be transformative for the AI power equation. Whether that happens on a timeline relevant to the current buildout is genuinely uncertain.
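The footprint argument comes down to capacity factor: intermittent sources need far more nameplate capacity to deliver the same average power. The capacity factors below are assumed typical values, not figures from the article, and the comparison ignores the storage a solar or wind build would also need to be firm:

```python
# Nameplate capacity needed to average 1 GW delivered, by source.
# Capacity factors are assumed typical values, not cited figures.
TARGET_GW = 1.0                                      # firm supply target
CF = {"nuclear": 0.90, "solar": 0.25, "wind": 0.35}  # assumed capacity factors

nameplate = {source: TARGET_GW / cf for source, cf in CF.items()}

for source, gw in nameplate.items():
    print(f"{source}: {gw:.1f} GW nameplate")
```

Under these assumptions, matching 1 GW of delivered nuclear output takes roughly 4 GW of solar nameplate, before accounting for storage, which is the gap the physical-footprint argument rests on.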
