AI and Energy: Are we dramatically underestimating future needs?
AI energy needs are set to explode with agents. Here is one scenario.
In this post, I’m suggesting that there might be a fundamental misconception about how much energy AI will consume in the coming years. To grasp why our current projections may well be far too conservative, we need to look back at another transformative period in human history: the Industrial Revolution.
The Industrial Revolution: A Lesson in Energy Transformation
For centuries before industrialization, per capita energy consumption stayed relatively flat. As societies shifted from agricultural to urban lifestyles, energy use grew, but the changes were rather gradual. Then the Industrial Revolution hit, and everything changed.
Suddenly, humans could build machines far stronger than themselves. Steamboats, trains, factories, and later planes and industrial processes emerged – all extremely energy-intensive. As a consequence, per capita energy consumption increased dramatically – by roughly a factor of 5 within a century.
This shift happened for two reasons. First, entirely new capabilities became possible. This led to lifestyles we now take for granted: year-round heated housing, constant access to food and manufactured goods (including medicines), and the ability to travel anywhere by car or plane. Second, not participating in these activities was a massive economic disadvantage. The vast economic value creation came from harnessing the extreme physical capabilities of these new machines.
This is crucial to the key argument here: the surge in energy use wasn’t just to do more of what we were already doing, but to do things we hadn’t done before, or to do them at previously unimaginable scales. To move around, we didn’t just get more horses – we got trains and planes. This, in turn, enabled massive economic growth, globalization, and all the benefits (and downsides) that came with it.
The Recent Past: Efficiency and Climate Awareness
More recently, in the past 50 years, the changes in lifestyle haven’t been as dramatic. Sure, things have become more comfortable, medicine has advanced, and technology has improved, but the basic framework remains similar to what it was when I was born, at least in Europe and North America. Economic value was no longer generated by building ever more physically powerful machines, but by miniaturizing and automating processes with computers. As a consequence, we’ve become much more efficient at doing what we started after the Industrial Revolution, leading to relatively flat or even reduced per capita energy usage in developed regions (though some countries continued to grow in per capita energy usage).
Meanwhile, more of the world joined the club to enjoy the fruits of the revolution, and total energy usage became an issue. Growing up in the 80s and 90s, I remember the constant focus on saving energy – doing the same with less. Today, we face an even stronger message: fly less, avoid multiple cars, reduce heating and A/C, cut back on consumption. But it has turned out to be a tough sell. Despite massively growing climate awareness, roads are fuller, air traffic has rebounded, and Green parties have lost quite a bit of traction. Clearly, something isn’t working. The notion that we will solve our energy issues by strongly reducing energy use hasn’t taken off anywhere yet, and it likely never will.
The Real Problem: Not Energy, But Its Sources
The core issue isn’t energy consumption itself, but rather how we generate it. All energy production methods have downsides, but fossil fuels have an obvious one: climate change. This impact is becoming impossible to ignore – from increased flooding and droughts to unbearable summers and disappearing winter snow, not to mention impacts on disease patterns, agriculture, and the habitability of certain regions.
In the middle of this escalating challenge, our energy story is likely to take another dramatic turn because of AI. Modern AI – i.e. the large foundation models – requires energy in two ways: training the model, and then deploying it (inference). While the energy required for training these models is substantial, the largest share falls to inference, i.e. the use of the models after training. The real energy demand will thus come from deployment, when hundreds of millions of people use these models daily.
Here’s where many analyses potentially go very wrong: they often compare something like a ChatGPT request to a Google search query. While it’s a fun comparison, it vastly underestimates the coming energy demands because it assumes human usage patterns. But in the near future, it won’t just be humans making these requests. We’ll each have dozens or even hundreds of AI agents working on our behalf, making hundreds of requests to various AI models to solve problems we haven’t even thought of yet.
Let’s do a quick back-of-the-envelope calculation. It’s estimated that a GPT request uses about 3 Wh (10x the energy of a Google search). Let’s assume clever engineering brings that down to 1 Wh, even as the models get bigger and more powerful. Current estimates suggest 100 million active GPT users. Let’s assume each of those makes on average 10 requests a day, either through the chatbot interface or through the API. That activity alone already consumes about 1 billion Wh, or 1 GWh, per day. Now project to a future where a billion people use these models, each with 10 agents making about 1000 requests daily. This gives us: 1 billion users × 10 agents × 1000 requests/day × 1 Wh = 10 trillion Wh = 10,000 GWh = 10 TWh per day. That’s already about 1/7 of global daily electricity production.
Per year, this would correspond to 3650 TWh. If we assume that the US will represent about 300 million of those users, this boils down to a yearly need of roughly 1000 TWh in the US alone. Compare this to a recent report by McKinsey, which projects about 600 TWh of demand for all US data centers, across all uses.
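For readers who want to check the arithmetic, here is a minimal sketch of the same back-of-the-envelope calculation in Python. All figures are the assumptions from above (1 Wh per request, the user and agent counts), not measurements:

```python
# Back-of-the-envelope AI inference energy, using the assumptions above.

WH_PER_REQUEST = 1.0  # assumed average energy per AI request, in Wh

# Today: ~100 million active users making ~10 requests per day each
today_wh = 100e6 * 10 * WH_PER_REQUEST
print(f"Today:  {today_wh / 1e9:.0f} GWh/day")        # -> 1 GWh/day

# Agent scenario: 1 billion users x 10 agents x 1000 requests/day
scenario_wh = 1e9 * 10 * 1000 * WH_PER_REQUEST
print(f"Agents: {scenario_wh / 1e12:.0f} TWh/day")    # -> 10 TWh/day

# Yearly total, and a rough US share (300 million of the 1 billion users)
yearly_twh = scenario_wh * 365 / 1e12
us_twh = yearly_twh * 300e6 / 1e9
print(f"Yearly: {yearly_twh:.0f} TWh, US share ~{us_twh:.0f} TWh")
```

The US share comes out at just under 1100 TWh per year, which is the “roughly 1000 TWh” above.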
Now, you may argue that these numbers are too high, but if something is off, it’s likely the number of agents or requests, not the overall picture. For example, it’s hard to imagine fewer than a billion people using AI (it’ll likely be more), and it’s hard to imagine much less than 1 Wh per AI request, on average. Yes, some models might be smaller and more efficient, but others will be far larger. Projecting anything substantially below 1 Wh per request might be wishful thinking at this stage. Perhaps the entire scenario with all of us having multiple 24/7 agents is wrong, but I very much doubt that.
What’s more, if you think 10 agents each making 1000 requests a day is too high, I’d counter that it might even be too low. Why not hundreds of agents? Also, don’t forget that all your apps, and all your appliances, will probably use LLMs. In general, bringing more intelligence to the table will allow me to do more in the same time, and thus become much more efficient than my peers. In any economic setting, I will be incentivized to do that. Unless, of course, energy is so constrained that it becomes too costly. Which might easily happen: if we go up just one more order of magnitude in these calculations, AI alone would use more electricity than the entire world produces today (see the quick check below).
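As a quick sanity check on that last claim – taking global electricity production to be roughly 30,000 TWh per year, which is approximately the current figure:

```python
# Scaling the agent scenario up by one order of magnitude.
scenario_twh_per_year = 3650       # from the calculation above
world_twh_per_year = 30_000        # rough current global electricity production
print(scenario_twh_per_year * 10)  # -> 36500 TWh/year, above world production
```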
The Intelligence Arms Race
This brings me to a potentially uncomfortable reality: choosing to limit AI energy use means choosing to be intelligence-limited. If your competitors are willing to use 10 times more energy for AI (10x more agents, or 10x more requests), they’ll bring more AI intelligence to the table. Even if it’s not 10x more intelligence, it might be 2-3x more. You’ll be outsmarted. It’s hard to imagine many scenarios where being intelligence-constrained becomes an asset rather than a massive disadvantage.
Yes, the numbers and assumptions here are debatable. This isn’t a hard prediction; it’s a scenario. But if this scenario plays out, we’re in trouble. And unless there’s some global agreement to cap AI energy use (which, given our track record with international agreements, seems unlikely), we’re looking at potential energy demands that current projections drastically underestimate.
The Nuclear Question
This is part of why major tech companies are increasingly looking toward nuclear power. They’re concerned about securing enough energy for their data centers, especially as projections grow. They understand that renewables alone might not be enough. And they want to avoid fossil fuels if they can. Plus, so far, we’ve only discussed AI – there’s also the broader transition from fossil fuels to electricity, including the shift to electric cars.
While nuclear power produces minimal greenhouse gas emissions, it comes with its own well-known challenges: the risk of catastrophic accidents and the problem of long-term radioactive waste storage. It’s one of the reasons I’ve recently become extremely interested in new nuclear technologies. A few weeks ago, I had the privilege of visiting one of the leaders in the field, Geneva-based Transmutex. I’ll write more about them in a follow-up post.