“There’s a better way to do it – find it.”
— Thomas Edison
In 1882, Thomas Edison flipped the switch on America’s first electric grid, lighting up a square mile of Lower Manhattan with 100 kilowatts of power. This was a turning point in modern history.
Today, a new technology is driving a fresh wave of innovation. Artificial Intelligence (AI) has the potential to reshape our economy, boost productivity, and accelerate solutions to the world’s toughest problems.
However, enthusiasm for this technology’s extraordinary transformative potential must be tempered by an understanding of its costs. Behind the scenes of every chatbot, language model, and AI inference engine lies a data center humming with servers, cooling systems, and enormous electrical demands.

Growth in AI has ended a two-decade run of flat electricity demand in the U.S. We’re now designing systems to serve AI models that can consume gigawatts, and today’s power grid wasn’t built for this.
AI and the grid: A tipping point for U.S. energy
The risks are real. At the Schneider Electric™ Sustainability Research Institute, our latest research on AI explores a critical inflection point—one where the future of AI and the future of energy must now be considered as one and the same.
We predict that by 2030, power demand could exceed current grid capacity by over 29 gigawatts. AI alone could account for 20 to 50 percent of that growth. Regions like Texas, Northern Virginia, and California are already feeling the pressure, and seven major U.S. grid areas face potential reserve shortfalls by 2028.
The challenge isn’t abstract or distant. It’s unfolding now, and the outcome depends on the choices we make in the near term. To help clarify what’s at stake, we modeled four plausible futures shaped by how we manage AI’s energy demands.
Four potential futures—and one critical choice
Which of the four futures below becomes reality depends on our actions—through policy, design, infrastructure, and disciplined execution.
Sustainable AI: This is the outcome we aim for. AI systems have become more efficient by design. Data centers are optimized for precision, not excess. Grid modernization keeps pace, and smart planning channels demand where it can do the most good. In this future, AI supports grid stability and accelerates decarbonization, becoming part of the solution, not just a driver of growth.
Abundance Without Boundaries: Efficiency gains lead to runaway use. AI gets cheaper to run, so usage explodes. Data centers scale rapidly, and grid demand climbs with them. Growth continues, but it becomes harder to manage, and the energy benefits are erased by volume. The environmental strain grows, and the grid feels the pressure.
Limits to Growth: Here, the bottlenecks win. Grid infrastructure can’t keep up with demand. Permitting takes too long. Regulatory systems lag, and projects stall, not because AI falls short but because power isn’t available. Innovation hits a ceiling imposed by infrastructure, not imagination.
Energy Crisis: The worst case. AI expands quickly, with little coordination or planning. The grid falls behind. Regions face capacity shortfalls, and instability spreads. Costs rise. Emissions go up. AI’s potential remains, but it becomes a risk to the systems it depends on.
These futures aren’t set; they reflect tradeoffs already taking shape. Turning risk into resilience will depend on whether we make the right moves quickly enough.
Three pathways to sustainable AI
Avoiding disruption and managing growth will require focused action across three fronts:
- Bend the Curve of AI Demand
AI systems should be built for efficiency from the start. That means computing hardware with lower energy intensity, smarter algorithms, and data centers designed to minimize waste and maximize precision.
- Strengthen and Modernize the Grid
The grid needs upgrades that match the pace and shape of new demand. Faster permitting, expanded transmission, local storage, and responsive control systems can create the flexibility necessary to keep the energy flowing where and when it’s needed.
- Make Data Centers Part of the Solution
Data centers can do more than consume power. With investment and coordination, they can support the grid by storing energy, managing load, and operating as flexible assets that improve system stability.
If we get this right, AI can accelerate the energy transition—not derail it. It can drive investment in modern infrastructure and clean energy rather than forcing dependence on legacy fuels and inefficient expansion.
Conclusion: Powering intelligence with purpose
We are entering a new phase where intelligence is built into machines, cities, and entire industries. But intelligence without structure is noise. As Claude Shannon wrote, “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.” The same principle applies to energy. The system must deliver electricity—cleanly, reliably, and with precision—to where it’s needed, when it’s needed, to support technologies built to learn and adapt.
Every wave of innovation reshapes the systems that support it. The demands of AI will reshape energy infrastructure in lasting ways. Whether those changes lead to greater resilience or greater risk depends on the choices we make now.
Edison built a grid to power lightbulbs. He’d be inspired by a future where that same grid must power intelligence itself—and be excited by the scale of the challenge. This moment calls for invention, focus, and action. There’s a better way forward. Together, we can build it. Download the full report, Powering Sustainable AI in the United States, to explore the path ahead.