OP, this statement is bullshit. You can do about 5 million requests for the CO2 of ONE flight.
I'm gonna quote my old post:
I had the discussion regarding generated CO2 a while ago here, and with the numbers my discussion partner gave me, the calculation said that the yearly usage of ChatGPT is approx. 0.0017% of our CO2 reduction during the COVID lockdowns - chatbots are not what is killing the climate. What IS killing the climate has not changed since the green movement started: cars, planes, construction (mainly concrete production) and meat.
The exact energy costs are not published, but 3 Wh/request for ChatGPT-4 is the upper limit from what we know (and that's in line with the approximate power consumption of my graphics card when running an LLM). Since Google uses it for every search, they have probably optimized for their use case, and some sources cite 0.3 Wh/request for chatbots - it depends on what model you use. The training is a one-time cost, and amortizing it for ChatGPT-4 raises the maximum cost to 4 Wh/request. That's nothing. The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households. This is for one of the most downloaded apps on iPhone and Android - setting this against the massive usage makes clear that saving here is not effective for anyone interested in reducing climate impact, unless you also want to start scolding everyone who runs their microwave 10 seconds too long.
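To sanity-check the "one flight" comparison, here's the back-of-envelope version; the per-request energy, grid carbon intensity, and per-passenger flight emissions are all rough assumptions on my part, not published figures:

```python
# Rough check: how many chatbot requests equal one flight's CO2?
# All three inputs are ballpark assumptions, not measured values.
WH_PER_REQUEST = 0.3        # optimized chatbot request (low-end estimate above)
GRID_KG_CO2_PER_KWH = 0.4   # assumed average grid carbon intensity
FLIGHT_KG_CO2 = 600         # assumed per-passenger medium-haul flight

kg_co2_per_request = (WH_PER_REQUEST / 1000) * GRID_KG_CO2_PER_KWH
requests_per_flight = FLIGHT_KG_CO2 / kg_co2_per_request
print(f"{requests_per_flight:,.0f} requests ≈ one flight")  # → 5,000,000
```

Even with the 3 Wh worst-case figure instead, you still get ~500,000 requests per flight.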
Even compared to other online activities that use data centers, ChatGPT's power usage is small change. If you use ChatGPT instead of watching Netflix, you actually save energy!
Water is about the same story, although the positioning of data centers in the US sucks. The used water doesn't disappear, though - it's mostly returned to the rivers or is evaporated. Water usage in the US is 58,000,000,000,000 gallons (220 trillion liters) per year. A ChatGPT request uses between 10-25 ml of water for cooling. A hamburger uses about 600 gallons of water. 2 trillion liters are lost due to aging infrastructure. If you want to reduce water usage, go vegan or fix water pipes.
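The same back-of-envelope treatment works for the hamburger comparison, taking 20 ml as the midpoint of the 10-25 ml range quoted above:

```python
# How many requests' worth of cooling water is one hamburger?
ML_PER_REQUEST = 20          # midpoint of the 10-25 ml range above
HAMBURGER_GALLONS = 600      # rough figure quoted above
ML_PER_GALLON = 3785.41

requests_per_burger = HAMBURGER_GALLONS * ML_PER_GALLON / ML_PER_REQUEST
print(f"one hamburger ≈ {requests_per_burger:,.0f} requests of cooling water")
```

So one burger is on the order of 100,000 requests' worth of cooling water.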
If you want to look at it another way, if you assume every single square inch of silicon from TSMC is Nvidia server accelerators/AMD EPYCs, every single one running AI at full tilt 24/7/365…
Added up, it’s not that much power, or water.
That’s unrealistic, of course, but that’s literally the physical cap of what humanity can produce at the moment.
The combined worldwide energy usage of ChatGPT is equivalent to about 20k American households.
Or about 10 small countries.
Not even being that hyperbolic: American households are fabulously, insanely wasteful of energy.
The rest of the world (barring places like Saudi Arabia, which are rarely used as moral or socio-cultural examples the world should learn from) has done the whole ‘What’s the point in trying to better the world when America and China do more damage than the rest of the world combined?’ debate decades ago, and we ended up deciding that we can’t control the worst offenders, and can only do what we can.
Literally any moral value or standard is subject to ‘but but but what’s the point if you can’t eradicate the problem entirely?’, that’s why it’s such a weak fallacy.
Minimising absolutely pointless destruction of non-renewable resources won’t successfully save the environment tomorrow, but we can do it anyway, and it will help.
We can’t eradicate theft, but we can do our best to pay for things before taking them. We know that being polite in public isn’t the 1 thing holding our society back from utopian perfection, but we do it anyway, because it helps.
We can all pinky promise not to murder or violently assault anyone, and pay no attention to the weirdo protesting that ‘What’s the point in not assaulting people when actually, cars and illness and unhealthy lifestyles do more harm’, because that person is presumably just looking for an excuse to hit someone.
And yeah, long story short: using ‘American households’ as an example of how insignificant AI’s energy usage is, is kinda like saying smoking is safe because it’s actually less harmful than spending 6 hours a day on a busy road in Delhi.
If you don’t spend 6 hours a day near busy roads in Delhi, you won’t exactly think ‘oh that’s ok then’.
And if you do, your lungs need all the help they can get and you’ve got all the more reason to be wary of smoking (I say this as a smoker btw).
Huge areas of Africa and the Middle East are becoming uninhabitable because of climate change. Those people all need food and water, and the Western world does not have the resources or inclination to house and feed them all. It is almost unanimously described as the worst crisis humanity has ever faced, and the practical solution - stop wasting fossil fuels and non-renewable resources when there’s a viable alternative - is so insanely easy.
Billions of lives could be saved, if everyone on the planet agreed to be mindful of energy waste.
Not ‘stop using energy’ or ‘everybody become vegan and live in houses made of recycled banana peel’, just quit wasting.
But there are entire countries who don’t seem to get the whole ‘acting together for the betterment of humanity’ thing, so that incredibly simple solution won’t work.
And all we can do in the meantime is to lead by example, make ‘responsible consumption’ a lifestyle rather than an option, and hope against hope that enough Americans and Chinese people decide to reduce their dependence on 1000 daily images of shrimp Jesus or an endless output of bullshit papers written by AI to pretend that’s what science means, in time to maybe save some of the planet before wildfire season lasts 12 months a year.
Also: it’s not even like you’re gaining anything from constantly using AI or LLMs. Just fleeting dopamine hits while your brain cells wither. Of all the habits one could try to reduce, or be mindful of, to literally save lives and countries, anybody who honestly thinks generative AI is more important is very addicted.
But there are entire countries who don’t seem to get the whole ‘acting together for the betterment of humanity’ thing,
I would describe it as ‘indoctrinated by Big Oil’, heh… It is awful.
Also: it’s not even like you’re gaining anything from constantly using AI or LLMs. Just fleeting dopamine hits while your brain cells wither. Of all the habits one could try to reduce, or be mindful of, to literally save lives and countries, anybody who honestly thinks generative AI is more important is very addicted. Also also: it’s just so shit.
The majority of text ingestion/token generation is consumed by other machines, for stuff like coding assistants or corporate data processing, and this includes image ingestion. I dunno what fraction image/video generation is, but I suspect it’s not high, as there’s really no point outside of cheap spam.
You are not wrong, and corpo AI is shit for plenty of reasons (including being needlessly power hungry when it doesn’t have to be), but I’m not relenting on this being a ‘small fish’ issue compared to the massive waste in so many other parts of the US.
Big Oil and such delight in such distractions because it draws attention away from their more profitable and harmful sectors they’d rather people forget about.
Do you have a source for this claim? I see this report by Google and MIT Tech Review that says image/video generation does use a lot of energy compared to text generation.
Taking the data from those articles, we get this table:
| AI Activity | Source | Energy Use (per prompt) | Everyday Comparison |
|---|---|---|---|
| Median Gemini Text Prompt | Google Report | 0.24 Wh | Less energy than watching a 100 W TV for 9 seconds. |
| High-Quality AI Image | MIT Article | ~1.22 Wh | Running a standard microwave for about 4 seconds. |
| Complex AI Text Query | MIT Article | ~1.86 Wh | Roughly equivalent to charging a pair of wireless earbuds for 2-3 minutes. |
| Single AI Video (5-sec) | MIT Article | ~944 Wh (0.94 kWh) | Nearly the same energy as running a full, energy-efficient dishwasher cycle. |
| “Daily AI Habit” | MIT Article | ~2,900 Wh (2.9 kWh) | A bit more than an average US refrigerator consumes in a full 24-hour period. |
I like that as well, thank you! Yeah, the “Daily AI Habit” in the MIT article was described as…
Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise.
Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram.
You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.
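For what it's worth, that 2.9 kWh total does check out against the per-prompt figures in the table (15 text queries, 10 images, 3 videos):

```python
# Re-deriving the MIT "daily AI habit" total from the table's figures.
WH_TEXT = 1.86    # complex AI text query
WH_IMAGE = 1.22   # high-quality AI image
WH_VIDEO = 944    # single 5-second AI video

total_wh = 15 * WH_TEXT + 10 * WH_IMAGE + 3 * WH_VIDEO
print(f"{total_wh / 1000:.2f} kWh")  # → 2.87 kWh, i.e. the quoted ~2.9 kWh
```

Notice the three videos account for ~99% of the total; drop them and the whole "habit" is about 40 Wh.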
As a daily AI user, I almost never use image or video generation and it is basically all text (mostly in the form of code), so I think this daily habit likely wouldn’t fit for most people that use it on a daily basis, but that was their metric.
The MIT article also mentions that we shouldn’t try to reverse-engineer energy usage numbers, and that we should instead encourage companies to release data, because such estimates are invariably going to be off. Google’s technical report affirms this: it shows that non-production estimates of AI energy usage overestimate, because of the economies of scale a production system is able to achieve.
Edit: more context: my daily AI usage, on the extremely, extremely high end, is let’s say 1,000 median text prompts from a production-level AI provider (code editor, chat window, document editing). That’s 240 Wh, equivalent to watching a 100 W TV for about 2.4 hours. The average daily consumption of TV in the US is around 3 hours per day.
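Using the table's 0.24 Wh median text prompt and its 100 W TV comparison, 1,000 prompts work out to:

```python
# Converting a heavy daily text-prompt habit into TV-watching time.
WH_PER_TEXT_PROMPT = 0.24   # median Gemini text prompt (Google report, table above)
TV_WATTS = 100              # the wattage the table's TV comparison assumes

daily_wh = 1000 * WH_PER_TEXT_PROMPT
tv_hours = daily_wh / TV_WATTS
print(f"{daily_wh:.0f} Wh ≈ {tv_hours:.1f} h of TV")  # → 240 Wh ≈ 2.4 h
```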
If you only include chatbots, your numbers look good. Sadly, reality isn’t just chatbots.
I’m not sure what you’re referencing. Imagegen models are not much different, especially now that they’re going transformers/MoE. Video gen models are chunky indeed, but more rarely used, and they’re usually much smaller parameter counts.
Basically anything else machine learning is an order of magnitude less energy, at least.
Could you explain further?
Image/video generation and analysis (them scrubbing the entire public internet) consume far, far more than someone asking an AI “grok, is this true”.
Another way of looking at this: A “Daily AI Habit” on your table is about the same as driving a Tesla 10 miles, or a standard gas car about 3 miles.
Edit: 4 AI videos, or detouring to take the scenic route home from work, have about the same impact.
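Quick check on those driving figures; the EV efficiency and gas mileage are my own assumed round numbers, not anything from the articles:

```python
# Comparing the 2.9 kWh "daily AI habit" to driving, in fuel-energy terms.
HABIT_KWH = 2.9
EV_WH_PER_MILE = 250          # assumed efficient EV consumption
GAS_KWH_PER_GALLON = 33.7     # energy content of a gallon of gasoline
GAS_MPG = 30                  # assumed average gas car

ev_miles = HABIT_KWH * 1000 / EV_WH_PER_MILE
gas_miles = HABIT_KWH / (GAS_KWH_PER_GALLON / GAS_MPG)
print(f"EV: {ev_miles:.1f} miles, gas car: {gas_miles:.1f} miles")
```

That comes to roughly 11.6 EV miles and 2.6 gas-car miles, so "10 miles in a Tesla / about 3 miles in a gas car" is in the right ballpark.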
Please elaborate?