27 Comments
Desmond Aubery:

Excellent review. Thank you. Subscribed to your substack.

Carl-Ola Danielsson:

The car in the example is a gasoline car, which mainly converts the energy in the fuel into heat. The energy efficiency of EVs is much higher.

Regarding the shower the calculations are correct and even a bit conservative I would say. Many showers use more water than 6-7l/min. If you have a heat pump not all of the energy used to warm the water shows up on your electricity bill. Domestic hot water production is a BIG part of the energy we consume in our homes.
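As a rough check, the heating energy follows from the specific heat of water; the flow rate, duration and temperature rise below are my assumptions:

```python
# Rough estimate of the energy needed to heat water for a shower.
# Assumptions: 6.5 l/min flow, 5 min shower, water heated from 10 C to 38 C.
FLOW_L_PER_MIN = 6.5
DURATION_MIN = 5
TEMP_RISE_C = 38 - 10          # cold inlet to shower temperature, in Kelvin
SPECIFIC_HEAT_KJ = 4.186       # kJ per kg per Kelvin (water)

litres = FLOW_L_PER_MIN * DURATION_MIN             # ~32.5 l, i.e. ~32.5 kg
energy_kj = litres * SPECIFIC_HEAT_KJ * TEMP_RISE_C
energy_kwh = energy_kj / 3600                      # 1 kWh = 3600 kJ

print(f"{litres:.1f} l heated -> {energy_kwh:.2f} kWh")
# prints: 32.5 l heated -> 1.06 kWh
```

So roughly 1 kWh for the heating alone, before counting the rest of the shower's overhead, which is why I'd call the article's number conservative.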

Carl-Ola Danielsson:

This was supposed to be a response to Gator's questions.

Timo Uustal:

Good comparison, in terms people can relate to. Well done!

Pier Paolo Monticone:

A very clear and relatable way of presenting the topic—well done! One point that might round out the comparison is that data centers don’t just use energy; they often consume significant amounts of water for cooling as well. It would be great to see that aspect included to give an even fuller picture of AI’s resource footprint.

f.:

Isn't the water self-contained within the system?

Pier Paolo Monticone:

“A major tech company’s data centres can consume many billions of litres of water annually, in some cases rivalling the water consumption of major beverage companies,” says Shaolei Ren, an associate professor in electrical and computer engineering at the University of California Riverside. He estimates that global demand for AI processing will consume 4.2bn-6.6bn cubic metres of water abstracted from ground or surface sources in 2027.

Ref: https://www.ft.com/content/65fff689-bd47-4c15-bdb8-083e5ccd84dc

Mats Stafseng Einarsen:

Sadly, this is based on a misunderstanding, or a lack of knowledge, of how these chatbots work.

You appear to assume that each chatbot message is an independent ~200-token unit, meaning the public estimates of energy use can be divided by 10. That's not the case. GPT does *completions*: every new message exchange processes the whole conversation from the beginning.

If you've ever had a conversation with ChatGPT where you get the message that long conversations will make you reach your limit faster, you are actually sending and receiving 20,000 tokens or more for every message exchange. So the 2,000-token estimate is probably much closer to reality (200 for the first message in a conversation, but with a long tail of much larger message sizes).
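To illustrate the point, here is a sketch of how the processed-token count grows; the per-exchange token count is a hypothetical round number:

```python
# Each exchange re-processes the entire conversation so far, so the total
# number of tokens processed grows quadratically with conversation length.
# Hypothetical round number: every exchange adds 200 new tokens.
TOKENS_PER_EXCHANGE = 200

def tokens_processed(n_exchanges: int) -> int:
    """Total tokens the model processes over a whole conversation."""
    total = 0
    context = 0
    for _ in range(n_exchanges):
        context += TOKENS_PER_EXCHANGE  # the history keeps growing
        total += context                # and is re-processed every time
    return total

for n in (1, 10, 50):
    avg = tokens_processed(n) / n
    print(f"{n} exchanges: {tokens_processed(n)} tokens total, {avg:.0f} per exchange")
```

Even with only 200 new tokens per turn, a 50-exchange conversation averages thousands of processed tokens per exchange, not 200.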

I wish you'd address that; this has gone the rounds, but it's misinformation about an important topic. Personally, I still believe a 10x energy usage is worth it for the value the AI generates, but it also moves it into the range where we need to take it into consideration.

Marcel Salathé:

Your comment is based on a misunderstanding about a misunderstanding. I am fully aware how these models work. We can argue about what the right numbers are, and I've made it clear multiple times that there are uncertainties, but they go in both directions.

Others arrive at similar estimates in different ways.

Please be careful about using the term misinformation. It should not be used lightly.

Mats Stafseng Einarsen:

I don't see your response engaging with the core of my criticism. It seems difficult to accept that the average request is only 200 tokens when all the major AI companies are using large context windows as key selling points, promoting capabilities of 128k to a million tokens. Your entire calculation hinges on revising the 2k estimate down to 200, but I don't see evidence supporting this significant adjustment. Unfortunately, your calculation is now out there, and while none of us may know the exact numbers, we can be fairly certain this is not it.

Geo:
Apr 3 (edited)

I have looked for serious and recent studies on this topic, and I have found values for chatbot use that are more than 10 times as high (BestBrokers et al.). On top of that, the energy consumption of the ongoing model trainings is not being considered, nor is other use of natural resources such as water; see below.

Swamp Seer:

Interested in a piece comparing the water use of meat eating, please. Doesn't eating meat consume more water? Unless we talk about that, people getting upset about AI's environmental impact seems like performative virtue signaling. They do more damage when they eat lunch! Lol, thanks.

Ronald:

It's an interesting thought. However, is this AI use estimation a combined false equivalence and hasty generalisation? I don't initiate all AI-related activity connected to me - AI also uses my data even without my knowledge - like shopping online for nutritional yeast (yes, keeping it vegan...) and posting on Workplace. As you no doubt know, when I permit my data to be used by AI, I leave a data-energy footprint which can have a separate environmental impact, even though I have not been the direct agent of AI use. Chatbot use is only one part of the AI data footprint, so I respectfully suggest that this analysis is not representative of AI energy usage. Regardless, I am now taking fewer baths and more showers...

Cyril Matthey-Doret:

The estimate of 0.2 Wh for a typical interaction seems a bit optimistic to me: if I use the ecologits calculator (https://huggingface.co/spaces/genai-impact/ecologits-calculator) to check the consumption of open-source models, say Llama 3.1 405B, it gives 95.4 Wh for a short (400-token) conversation.

IIRC, OpenAI's models are even larger; even if the infrastructure is more optimized, I hardly see how the consumption could be two orders of magnitude smaller.
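Per token, the two figures imply (using the interaction sizes quoted above):

```python
# Implied energy per token for the two estimates discussed above.
ecologits_wh, ecologits_tokens = 95.4, 400  # Llama 3.1 405B via ecologits
article_wh, article_tokens = 0.2, 200       # the article's typical interaction

per_token_ecologits = ecologits_wh / ecologits_tokens  # ~0.24 Wh/token
per_token_article = article_wh / article_tokens        # 0.001 Wh/token
ratio = per_token_ecologits / per_token_article

print(f"{per_token_ecologits:.3f} vs {per_token_article:.3f} Wh/token, ~{ratio:.0f}x apart")
```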

Am I missing something?

Marcel Salathé:

I do not know how they come up with these values. I laid out how I arrived at it, and it seems very much in line with what others have found, e.g. https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use

Cyril Matthey-Doret:

Thanks, I missed that ref!

tl;dr: It seems it comes down to computing different things: the figure above counts just the GPU operations, whereas ecologits attempts to model the total energy increase on the server (RAM + CPU + cooling + GPU) (1).

I tried to go down the rabbit hole a bit. These numbers are based on the llm-perf-leaderboard (2), generated by the optimum-benchmark project (3), which empirically measures the consumption from running the models using codecarbon (4). For the GPU, codecarbon uses pynvml (5), which retrieves the power draw from NVIDIA's management library (6).

IIUC, ecologits fits parametric models on the empirical results from open-source models to extrapolate to closed-source ones. I'm not sure about the assumptions made in the modeling, or how similar a datacenter's hardware is, but the benchmarking approach seems reasonable to me.

1. https://ecologits.ai/latest/methodology/llm_inference/#modeling-server-energy-consumption

2. https://huggingface.co/spaces/optimum/llm-perf-leaderboard

3. https://github.com/huggingface/optimum-benchmark

4. https://mlco2.github.io/codecarbon/methodology.html#gpu

5. https://pypi.org/project/pynvml/

6. https://developer.nvidia.com/management-library-nvml

Chase Saunders:

You are talking about the use costs, which you have analyzed poorly (see the comments). The training costs far exceed them so far. The most reliable forms of cost-benefit analysis, both financial and in terms of e.g. the fact that these LLMs all leak entropy, show that the relative costs versus benefits are staggering.

Cherry-picking the things that make the thing you want to exonerate look exonerated is not an intellectually honest method.

Marcel Salathé:

It's pretty obvious that this does not include the training, in the same way that the car comparison does not take the car production into account. Speaking about intellectual honesty...

Chase Saunders:

Do you mean... like your *own* car analogy?

Have you never heard of Life Cycle Assessment or Life Cycle Energy Analysis? Absolutely what people who care about good analysis of cars and other manufactured goods do is look at the embedded energy throughout the whole process.

How does an omission being "obvious" make an analysis better?

Kurt Schwind:

This doesn't seem to include the cost of the original training of the models in use. Just the cost of running a model post-training. Is that right? Shouldn't the cost of the training be spread over the cost of usage? And if so, I think this number is on the low side.

Marcel Salathé:

Yes, this does not include the cost of training, but in the same way that the car comparison, for example, does not include the cost of production.

Kurt Schwind:

The energy cost of producing a hot-water heater is negligible compared with the energy the heater uses over its lifetime. I don't think the same can be said of the AI models (at least not yet).

Sarah Field:

Really interesting graphical representation, and I like the way you allow us to relate to it in real terms. The issue for me is that this isn't instead of car journeys or showers; it's as well as them. If we extrapolate this to entire populations, you are looking at significant additional energy use.

Marcel Salathé:

You are right, and cumulatively this can be an issue. Of course cars are also widespread, we have just accepted their use. I would personally prefer a debate where we talk about a reduction in car use, rather than a limit on AI use.

Geo:

Gemini 2.5 cites various data sources and gives an answer that looks quite different: the author's estimate seems to be at the absolute lowest end possible, and covers only part of the picture: https://aistudio.google.com/app/prompts?state=%7B%22ids%22:%5B%221hbqmzgBsJaf3XQcBHtIOoxblE49crY3G%22%5D,%22action%22:%22open%22,%22userId%22:%22107103139688692001248%22,%22resourceKeys%22:%7B%7D%7D&usp=sharing

Gabor Levendovics:

Interesting comparison. May I ask how a 10 km car journey equals 7.6 kWh? Any regular EV user can confirm that in town it's more like 15-17 kWh/100 km, while even in winter with an SUV on the highway it is no more than 30 kWh/100 km.

The other question would be about the 5-minute shower. Our family of 4 consumes 15 kWh per day to run the house, including cooking, the washing machine, the dryer, the dishwasher and, of course, showers.

The data seems a bit odd to me, hence the question.

Marcel Salathé:

The car calculation is in the text, but you are of course right that EVs are more efficient, by about a factor of 4 (as mentioned in the text).
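Spelled out with round numbers (the consumption and energy-content figures below are assumptions for illustration, not exact values from the article):

```python
# Rough energy comparison for a 10 km trip: gasoline car vs EV.
# Assumptions: 8 l/100 km gasoline consumption, ~9.5 kWh of chemical
# energy per litre of petrol, 16 kWh/100 km for a typical EV in town.
TRIP_KM = 10
PETROL_L_PER_100KM = 8
PETROL_KWH_PER_L = 9.5
EV_KWH_PER_100KM = 16

gasoline_kwh = TRIP_KM / 100 * PETROL_L_PER_100KM * PETROL_KWH_PER_L
ev_kwh = TRIP_KM / 100 * EV_KWH_PER_100KM

print(f"gasoline: {gasoline_kwh:.1f} kWh, EV: {ev_kwh:.1f} kWh, "
      f"~{gasoline_kwh / ev_kwh:.1f}x")
```

The 7.6 kWh figure counts the chemical energy in the fuel, most of which a combustion engine loses as heat; the EV's 1.6 kWh comes from the battery, which is where the roughly factor-of-4 difference comes from.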
