Every time an AI model parses terabytes of data for insights or answers millions of user queries in seconds, it relies on a vast, hidden reservoir of computational power.
Even as the technology grows more efficient, its power demand will only increase over time. As AI transforms industries and reshapes our digital landscape, one question looms large: will our power systems adapt to sustain this AI revolution?
THE GROWING ENERGY CRISIS
A single query to ChatGPT consumes about 2.9 watt-hours (Wh) of energy, nearly ten times that of a typical Google search, which uses around 0.3 Wh. Multiply that by billions of queries daily, and the total rivals the daily electricity use of a small city.
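The arithmetic behind that comparison can be sketched quickly. The per-query figures (2.9 Wh and 0.3 Wh) come from the text; the daily query volume of one billion is an illustrative assumption, not a reported number.

```python
# Back-of-envelope estimate of daily query energy at ChatGPT scale.
# Per-query figures are from the article; the query volume is assumed.

WH_PER_AI_QUERY = 2.9               # Wh per ChatGPT query (from text)
WH_PER_SEARCH = 0.3                 # Wh per Google search (from text)
QUERIES_PER_DAY = 1_000_000_000     # assumed illustrative volume

ai_mwh_per_day = WH_PER_AI_QUERY * QUERIES_PER_DAY / 1e6      # Wh -> MWh
search_mwh_per_day = WH_PER_SEARCH * QUERIES_PER_DAY / 1e6

print(f"AI queries:     {ai_mwh_per_day:,.0f} MWh/day")
print(f"Search queries: {search_mwh_per_day:,.0f} MWh/day")
```

At a billion queries a day, that works out to roughly 2,900 MWh daily, which is indeed on the order of a small city's consumption.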
Beyond the cumulative energy, there’s also the issue of power: the capacity needed at any moment to meet this demand. For example, in Texas, a major hub for digital infrastructure, peak power demand is projected to reach 152 gigawatts (GW) by 2030, almost double the grid’s current capacity. That’s like trying to power every home in Texas and California on the hottest day of the year, twice over. While such peaks are rare, the grid must be able to handle them.
Demand is outpacing supply across the U.S. John Ketchum, CEO of NextEra Energy, estimates that U.S. power demand will grow by 40% over the next two decades, compared to just 9% growth in the past 20 years. And the global picture is just as stark: data centers currently consume 1-2% of global electricity, a figure expected to soar. By 2034, global energy consumption by data centers is projected to exceed 1,580 TWh annually, equivalent to the total electricity used by all of India in a single year.
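To put the 1,580 TWh projection in context, it can be compared against total global electricity generation. The 1,580 TWh figure is from the text; the global generation figure of roughly 30,000 TWh per year is an outside assumption used only for illustration.

```python
# Rough context for the projected 2034 data-center consumption.
# Global generation (~30,000 TWh/yr) is an assumed, approximate figure.

DATA_CENTER_TWH_2034 = 1_580    # projection cited in the text
GLOBAL_GENERATION_TWH = 30_000  # assumed approximate global total

share = DATA_CENTER_TWH_2034 / GLOBAL_GENERATION_TWH
print(f"Projected data-center share of global electricity: {share:.1%}")
```

Under these assumptions, data centers would account for roughly 5% of global electricity, several times their current 1-2% share.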
5 | ADMISI - The Ghost In The Machine | Q1 Edition 2025
Unaccounted for in these estimates is the accelerating pace of AI innovation. Emerging reasoning models, such as OpenAI’s o3, require significantly more energy during inference as they process queries through step-by-step reasoning, mimicking human problem-solving. This iterative approach introduces a new tier of computational intensity, making inference increasingly expensive and marking a notable shift from traditional models, where training was the primary driver of energy consumption. Beyond these technical advancements, broader consumer adoption is set to amplify energy demand even further. From AI-powered assistants to autonomous vehicles and robotics, the integration of intelligent systems into everyday life will magnify their global energy footprint, pushing infrastructure to its limits.