The triangle that explains AI’s energy problem—and the choices we face

A forgotten triangle with new relevance

Artificial intelligence feels weightless. We type a prompt and answers appear, as if from nowhere. But the reality is physical: every calculation requires silicon, circuits, and above all, electricity. As demand for AI grows, so does the pressure on grids and infrastructure.

Decades ago, Swiss scholar Daniel Spreng proposed a simple but powerful idea: no system can maximize three things at once—energy, information, and time. Push forward on two, and the third becomes the constraint. This principle, known as Spreng’s Triangle, was initially developed to describe thermodynamic limits in computation. In idealized physical systems, you can process information quickly using lots of energy, or slowly using less energy, or efficiently by taking more time—but you cannot optimize all three simultaneously.

Spreng's Triangle

Source: Spreng, D. (1993) [1]

Spreng’s insight was that these three quantities are not independent variables. They form a triangle of constraints, where movement toward one corner creates pressure on the others. A high-speed process demands energy. An energy-efficient process takes time. Perfect information processing at zero energy cost requires infinite time—the thermodynamic limit where nothing happens.
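The energy–time tension in that last sentence can be made concrete with a toy calculation. The cubic power–speed relation below is an illustrative assumption (loosely inspired by common dynamic voltage scaling approximations), not a claim from Spreng's paper:

```python
# Toy illustration of a Spreng-style trade-off for a fixed workload.
# Assumption (not from the article): dynamic power grows roughly with
# the cube of processing speed. Units are arbitrary.

def energy_and_time(ops: float, speed: float) -> tuple[float, float]:
    """Return (energy, time) to finish `ops` operations at `speed` ops/s."""
    time = ops / speed      # higher speed -> less time
    power = speed ** 3      # toy model: power rises superlinearly with speed
    energy = power * time   # energy = power * time = ops * speed**2
    return energy, time

ops = 1e6
for speed in (1e2, 1e3, 1e4):
    e, t = energy_and_time(ops, speed)
    print(f"speed={speed:>8.0f} ops/s  time={t:>10.1f}  energy={e:.2e}")
```

Under this toy relation, finishing the same workload 100x faster costs 10,000x more energy: the only way to save both is to wait for better hardware, which is itself a time cost.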

From physics to energy systems

What does a principle from thermodynamics tell us about national AI infrastructure? Spreng’s framework cannot be directly applied to complex socio-technical systems; however, it highlights a truth that guided our recent study at Schneider Electric’s Sustainability Research Institute: digital growth is inseparable from physical limitations, and trade-offs are fundamental, not incidental.

When we set out to model AI’s electricity demand through 2030, we built a system dynamics model grounded in real data on data centers, grid capacity, chip production, and technology adoption rates. The model doesn’t use Spreng’s triangle mathematically. But his conceptual framework helped us recognize that our scenarios are fundamentally about systemic interactions between energy, information, and time:

  • Energy: The electricity consumed by AI systems
  • Information: The compute infrastructure built to process workloads
  • Time: The speed of deployment, buildout, and availability

Each scenario represents a distinct way in which these three dimensions interact as the system evolves. They are not predictions, but possible trajectories based on how infrastructure, policy, and adoption decisions play out.
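To make that mapping concrete, here is a deliberately minimal stock-and-flow sketch in the spirit of system dynamics. It is not the study's model; the growth rate, buildout cap, and units are invented for illustration:

```python
# Toy stock-and-flow sketch (NOT the study's model): compute capacity
# (information) chases demand, but yearly buildout is capped by grid
# and supply-chain constraints (energy, time). All numbers illustrative.

def simulate(years: int, demand_growth: float, buildout_cap: float) -> list[float]:
    """Return installed capacity per year, in arbitrary units."""
    demand, capacity = 1.0, 1.0
    history = []
    for _ in range(years):
        demand *= 1 + demand_growth           # information: desired compute
        gap = max(demand - capacity, 0.0)
        capacity += min(gap, buildout_cap)    # constrained buildout
        history.append(capacity)
    return history

# Same demand growth, two buildout regimes over six years:
unconstrained = simulate(6, demand_growth=0.4, buildout_cap=10.0)
constrained = simulate(6, demand_growth=0.4, buildout_cap=0.2)
```

Even this caricature reproduces the qualitative pattern in the scenarios below: the same demand curve yields very different deployed capacity depending on how fast the surrounding infrastructure can grow.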

Access the report: Powering sustainable AI in the United States

Four scenario paths

Our system dynamics model defines four scenarios, each reflecting different priorities and constraints in how energy, information infrastructure, and deployment timelines interact:

  • Sustainable AI assumes moderate growth in model complexity and widespread adoption of energy-aware design. This includes investment in efficient hardware, improved thermal systems, and model optimization. Demand reaches 261 TWh by 2030, growing at a decelerating rate after 2028. This scenario requires coordination between industry, utilities, and regulators. Growth is limited by deliberate choice, not physical constraint.
  • Limits to Growth assumes that infrastructure bottlenecks, permitting delays, and regulatory caps hinder the energy system’s expansion. AI development slows accordingly. By 2030, demand reaches only 127 TWh. This reflects real-world constraints such as data center interconnection delays, transformer shortages, and limited workforce capacity. It is not an engineered slowdown—it is a structural limit.
  • Abundance Without Boundaries assumes unconstrained growth. Demand reaches 504 TWh, creating energy competition between AI and other electrification priorities. The system stretches to accommodate rapid deployment, but consequences emerge in the form of grid instability, thermal overloading, and volatile pricing. Efficiency gains exist, but scale overwhelms them.
  • Energy Crisis reflects unsustainable growth that peaks early and then reverses. Demand reaches 254 TWh in 2028 but falls to 186 TWh by 2030 due to cascading failures: infrastructure delays, regional energy shortfalls, political backlash, and emergency regulatory interventions. This is not simply a worst-case scenario; it is a consequence of ignoring trade-offs and overestimating system elasticity.
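A bit of arithmetic on the endpoints reported above puts the spread between these trajectories in perspective. The TWh figures below are the ones quoted in the scenarios; the derived rates are back-of-envelope, not from the report:

```python
# 2030 AI electricity demand endpoints reported in the study (TWh).
scenarios_2030 = {
    "Sustainable AI": 261,
    "Limits to Growth": 127,
    "Abundance Without Boundaries": 504,
    "Energy Crisis": 186,  # after peaking at 254 TWh in 2028
}

# Spread between the most and least constrained paths:
spread = (scenarios_2030["Abundance Without Boundaries"]
          / scenarios_2030["Limits to Growth"])
print(f"2030 demand spans a ~{spread:.1f}x range across scenarios")

# Implied average annual decline in the Energy Crisis reversal (2028 -> 2030):
peak, trough = 254, 186
annual_decline = 1 - (trough / peak) ** (1 / 2)
print(f"Energy Crisis: ~{annual_decline:.0%} average annual decline after 2028")
```

A roughly fourfold gap between the lowest and highest 2030 outcomes is the headline: the same technology, under different constraints, produces radically different energy futures.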

Why this matters now

Spreng’s Triangle forces us to see the underlying trade-offs. Our system dynamics model shows how these might unfold across different trajectories. Together, they deliver a clear message: AI growth will not escape the laws of physics, but society retains agency in how those constraints are managed.

Governments determine how fast grid capacity expands. Regulators set permitting timelines. Companies decide whether efficiency is an afterthought or a design principle. Consumers influence the pace of adoption. These choices shape outcomes.

The triangle does not offer solutions. It helps frame the central question: of energy use, infrastructure capacity, and deployment speed, which are we willing to constrain? Physics will not change. The choices still can. For more details, explore Powering Sustainable AI in the United States.

[1] Spreng, D. (1993). Possibilities for substitution between energy, time and information. Energy Policy, 21(1), 13-23. https://www.sciencedirect.com/science/article/abs/pii/030142159390204S
