The average ChatGPT request uses ~0.34 Wh
Chatbots aren’t the problem. AI agents most likely will be.
I’ve written quite extensively about ChatGPT energy usage over the past few months (e.g. here, here, here, and here). Digging through various sources and papers, I eventually estimated that a simple, text-only ChatGPT request consumes about 0.2 Wh.
This was in stark contrast to many other estimates floating around, often suggesting energy use at least an order of magnitude higher.
OpenAI had remained silent on this - until now. Yesterday, in a blog post, OpenAI co-founder and CEO Sam Altman revealed that “the average (ChatGPT) query uses about 0.34 watt-hours”.
Bingo. Of course, it would be nice to eventually see a detailed analysis. You could, of course, argue that you don’t trust OpenAI or Sam Altman, and hence don’t trust this figure. But insofar as we take the CEO of the most important AI company at his word, the question is now settled.
0.34 Wh is still 70% higher than 0.2 Wh, but the important thing was to get the order of magnitude right. And given how much energy image and video generation consume, and that Sam is talking about the average query, I see no reason to update my 0.2 Wh estimate for a text-only ChatGPT request.
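The arithmetic behind the comparison can be checked in a few lines; the only inputs are the two figures already mentioned above:

```python
import math

openai_avg_wh = 0.34  # Sam Altman's stated average across all query types
text_only_wh = 0.20   # my earlier estimate for a simple, text-only request

# Relative difference: 0.34 is 70% above 0.20...
rel_diff = (openai_avg_wh - text_only_wh) / text_only_wh
print(f"{rel_diff:.0%}")  # → 70%

# ...but both land in the same order of magnitude (10^-1 Wh),
# well below estimates that were an order of magnitude higher.
same_order = math.floor(math.log10(openai_avg_wh)) == math.floor(math.log10(text_only_wh))
print(same_order)  # → True
```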
Allow me therefore to repost this figure, which seems to have aged well (it is only a few months old, but in the AI world, that’s pretty old 😇).
Allow me also to make the point again that this doesn’t absolve us from thinking deeply about the coming explosion in energy needs. Here as well, I am sticking to my guns, arguing that we may currently be underestimating our overall energy needs, and dramatically so.
In short: While a concern about exploding energy requirements in a world full of AI agents remains entirely defensible, regular chatbot use is really nothing to worry about in terms of energy use.
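To make the chatbot-vs-agent contrast concrete, here is a rough sketch. The per-query figure is OpenAI’s; the daily query volume and the agent multiplier are purely illustrative assumptions, not sourced numbers:

```python
# Why per-query energy is reassuring for chatbots but not
# necessarily for agents. Volume figures are hypothetical.

WH_PER_QUERY = 0.34             # OpenAI's stated average
ASSUMED_QUERIES_PER_DAY = 1e9   # hypothetical chatbot query volume

# Annual chatbot energy, converted Wh -> GWh
chatbot_gwh_per_year = WH_PER_QUERY * ASSUMED_QUERIES_PER_DAY * 365 / 1e9
print(f"Chatbots: ~{chatbot_gwh_per_year:.0f} GWh/year")  # → ~124 GWh/year

# An agent may fire many model calls per task and run continuously;
# assume (again, purely for illustration) a 100x volume multiplier.
AGENT_MULTIPLIER = 100
agent_gwh_per_year = chatbot_gwh_per_year * AGENT_MULTIPLIER
print(f"Agents (100x): ~{agent_gwh_per_year:.0f} GWh/year")
```

Even under these made-up volumes, the point stands: chatbot use stays modest at grid scale, while a modest agent multiplier pushes the total into territory worth worrying about.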
CODA
This is a newsletter with two subscription types. I highly recommend switching to the paid version. While all content will remain free, all financial support directly funds EPFL AI Center-related activities.
Interesting piece, Marcel. For me, energy consumption is only one issue with AI. If what you say about the energy consumption worries being overblown is true, then I'd love to see it used responsibly for beneficial purposes that did not steal intellectual property and undermine creative livelihoods.
But why are these figures so at odds with those that warn of the environmental impact? Is this just two sides of a propaganda war?
Hi Marcel, I recently read this article from MIT https://news.mit.edu/2025/explained-generative-ai-environmental-impact-0117
I think individual consumption, at ~0.2 Wh per request, indeed isn’t where “the problem” is. But I also agree with the MIT article that the combination of hype, low per-query energy use, and the commodity that a ChatGPT request is becoming compared to a simple web search just fuels the energy-hungry training of new models in the rat race. There’s no incentive for day-to-day users to be conscious about their AI usage, which reinforces the “model factory” of OpenAI, Anthropic and the likes.
What do you think? Curious to know.