this post was submitted on 19 May 2025
51 points (98.1% liked)

askchapo

22998 readers
164 users here now


Either that, or drinking an entire lake's worth of water per generated response. I see this claim a lot on hexbear and I'm genuinely curious whether I've been misinformed. How true are the claimed climate impacts of LLMs? Are they really burning a forest and draining a lake, or is this hyperbole? Because looking into it somewhat, I keep finding contradictions. Alex Avila pointed out a lot of these contradictions in his video AI Wars: How Corporations Hijacked Anti-AI Backlash (around 2:40:42), and I'm going to draw on what he said.

Anyways, take for example a random article, like this NPR one, where the writer cites Goldman Sachs for these claims:

According to a report by Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity as a Google search query.

Goldman Sachs has researched the expected growth of data centers in the U.S. and estimates they’ll be using 8% of total power in the country by 2030, up from 3% in 2022. Company analysts say “the proliferation of AI technology, and the data centers necessary to feed it” will drive a surge in power demand “the likes of which hasn’t been seen in a generation.”^[NPR: AI brings soaring emissions for Google and Microsoft, a major contributor to climate change]

First off, I didn't know Goldman Sachs was a research institution? I thought they were a financial and banking institution, which makes it strange that NPR is citing them. But NPR is just one of many, yes? So let's go to a different article, from UNEP:

Third, data centres use water during construction and, once operational, to cool electrical components. Globally, AI-related infrastructure may soon consume six times more water than Denmark, a country of 6 million, according to one estimate. That is a problem when a quarter of humanity already lacks access to clean water and sanitation.

Finally, to power their complex electronics, data centres that host AI technology need a lot of energy, which in most places still comes from the burning of fossil fuels, producing planet-warming greenhouse gases. A request made through ChatGPT, an AI-based virtual assistant, consumes 10 times the electricity of a Google Search, reported the International Energy Agency. While global data is sparse, the agency estimates that in the tech hub of Ireland, the rise of AI could see data centres account for nearly 35 per cent of the country’s energy use by 2026^[UNEP: AI has an environmental problem. Here’s what the world can do about that. ]

or one from The Commons.

According to the IEA, while a single Google search takes 0.3 watt-hours of electricity, a ChatGPT request takes 2.9 watt-hours.^[The Commons: Understanding AI's environmental footprint]

or Axios.

One oft-cited rule of thumb suggested that querying ChatGPT used roughly 10 times more energy than a Google search — 0.3 watt-hours for a traditional Google search compared with 2.9 watt-hours for a ChatGPT query.^[Axios: AI's climate impact is still a black box]

So, looking at the Goldman Sachs study^[Goldman Sachs: AI is poised to drive 160% increase in data center power demand], you find this claim:

A single ChatGPT query requires 2.9 watt-hours of electricity, compared with 0.3 watt-hours for a Google search, according to the International Energy Agency.

They cite the International Energy Agency, much like The Commons. In particular, they cite this report^[IEA: Electricity 2024 Analysis and forecast to 2026],

which mentions this:

Market trends, including the fast incorporation of AI into software programming across a variety of sectors, increase the overall electricity demand of data centres. Search tools like Google could see a tenfold increase of their electricity demand in the case of fully implementing AI in it. When comparing the average electricity demand of a typical Google search (0.3 Wh of electricity) to OpenAI’s ChatGPT (2.9 Wh per request), and considering 9 billion searches daily, this would require almost 10 TWh of additional electricity in a year.
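The IEA's back-of-envelope arithmetic can be reproduced directly from the figures quoted above:

```python
# IEA comparison: extra electricity if every Google search became a ChatGPT-style request
google_wh = 0.3        # Wh per standard Google search (IEA figure)
chatgpt_wh = 2.9       # Wh per ChatGPT request (IEA figure)
daily_searches = 9e9   # 9 billion searches per day

extra_wh_per_day = (chatgpt_wh - google_wh) * daily_searches
extra_twh_per_year = extra_wh_per_day * 365 / 1e12
print(f"{extra_twh_per_year:.1f} TWh/year")  # ≈ 8.5, the "almost 10 TWh" figure
```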

And for that figure the IEA cites this paper by De Vries^[The growing energy footprint of artificial intelligence]

The Axios article links to a different study, but that study links back to the De Vries paper. It's interesting how many of these trails lead back to the De Vries paper. To quote the relevant portion from De Vries:

Alphabet’s chairman indicated in February 2023 that interacting with an LLM could “likely cost 10 times more than a standard keyword search.6" As a standard Google search reportedly uses 0.3 Wh of electricity,9 this suggests an electricity consumption of approximately 3 Wh per LLM interaction.

Alex Avila points out how nonsensical this is in his video, around 2:46:45. He also points out the Goldman Sachs connection, and other finance-capital connections to this, later in the video.

Mainly he points out that the "10 times more than a standard keyword search" is a financial cost, not an energy one, while the 0.3 Wh is an energy cost. It's nonsense to take 0.3 Wh and multiply it by 10 based on a guess about financial cost to get an entirely new energy figure. I agree, and that 3 Wh per LLM interaction makes no sense, especially since it's based on Google keyword search, not LLM usage. Alex also points out that the Google keyword search figure is from 2009, and a lot has changed since then.

In his video, Alex goes on to talk more about the issues with the De Vries study, but based on this alone, those articles above are getting something wrong. I know the articles mention more, so I'll bring up Alex's other points, since I think he did a good investigation on this.

One of the things Alex points out is that AI companies and data-center operators aren't really transparent about their actual energy use, so it's hard for us to know. How can we really know how much is used?

Going back to Alex: in his video he mentions the study The carbon emissions of writing and illustrating are lower for AI than for humans,

which argues that AI costs are very low, as in the graph it presents.

That goes against what I hear others say, and against the articles above. Is there something wrong with that study?

Another thing worth mentioning (again, I'm referring to a lot of things Alex said) is that data centers only account for 1-2% of electricity use. Another article says this too:

Around the globe, data centers currently account for about 1 to 1.5 percent of global electricity use, according to the International Energy Agency.^[Scientific American: The AI Boom Could Use a Shocking Amount of Electricity]

The IEA link in that article is different, though: they linked to Data Centres and Data Transmission Networks.

What's interesting about that IEA page they linked is what it says here:

Data centres and data transmission networks are responsible for 1% of energy-related GHG emissions

Anyways, Alex also mentions that data centers and AI overall make up only a very small fraction, which I think is a fair point. How has AI really changed anything with regard to climate change when the same issues are still at hand? Last I checked, the U.S. military is still one of the largest polluters. Vehicles produce more pollution. So does agriculture. Yet somehow AI is out-contributing all of these? I know energy generation also takes up a portion, but as has been pointed out, data centers account for less than 2% of total electricity use, and AI is only a fraction of that 2% (a fraction that's expected to grow), since data centers are used for plenty of things besides LLMs.
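To put the "fraction of a fraction" point in rough numbers (the AI share of data-center load below is an illustrative assumption, not a sourced figure):

```python
# back-of-envelope: AI as a share of global electricity use
datacenter_share = 0.015   # data centres ≈ 1-2% of global electricity (IEA)
ai_share_of_dc = 0.20      # ASSUMPTION for illustration only, not a sourced number
ai_share_total = datacenter_share * ai_share_of_dc
print(f"AI ≈ {ai_share_total:.1%} of global electricity")  # ≈ 0.3%
```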

Also worth mentioning: the projected increase in energy use might be overestimated as well. Going back to that video at 2:57:00, the section on the issues with the IEA report is worth watching.

Besides that, another thing mentioned by Alex and others is water usage, but it's been pointed out that a lot of water is recycled in data centers, and some data centers use things like wastewater. Alex also points out that things like ChatGPT use less water than claimed, referring to the paper Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models, which says:

Additionally, GPT-3 needs to “drink” (i.e., consume) a 500ml bottle of water for roughly 10 – 50 medium-length responses, depending on when and where it is deployed.

A really good point Alex made is that a single hamburger has a bigger water footprint, at 1,695 liters of water for one hamburger. Another thing: from what I understand, training an AI does use a lot of energy, but when someone interacts with an AI (inference, I think), the energy cost is way less.
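Those two water figures can be put side by side; the per-response numbers follow directly from the 500 ml per 10-50 responses claim:

```python
# water per GPT-3 response, per the "Making AI Less Thirsty" estimate
bottle_ml = 500
ml_per_response_worst = bottle_ml / 10   # 50 ml (only 10 responses per bottle)
ml_per_response_best = bottle_ml / 50    # 10 ml (50 responses per bottle)

# hamburger comparison: 1,695 litres of water per burger
hamburger_ml = 1695 * 1000
worst_case = hamburger_ml / ml_per_response_worst  # ≈ 33,900 responses
best_case = hamburger_ml / ml_per_response_best    # ≈ 169,500 responses
print(f"one hamburger ≈ {worst_case:,.0f}-{best_case:,.0f} responses' worth of water")
```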

Anyways, I'm just wondering whether I'm being misled or getting some things wrong, since it seems like the climate effects of AI are rather overblown. I think Alex did a really good investigation into this. I'm being genuine here: to me, a lot of AI stuff is overhyped, and I feel there's some reactionary sentiment toward it, treating it as a scapegoat for everything wrong, which only contributes further to the hype.

There are some valid issues with AI. Besides that, China is showing proper uses for it, and it also helps that for the last few years they've been using more renewable energy, which cuts out a lot of the emissions tied to AI's electricity use, no?

top 31 comments
[–] [email protected] 4 points 1 day ago

I’m gonna hit this thread one more time because while cooking I realized another thing: I spent a little under an hour reading the source material on just one element of the OP's post and typing out a few summaries of it. What isn’t included in that is the apropos point that my hour of human writer labor, to the extent that a person can mischaracterize tippa-tapping out forum posts as such, produced two hundred or so words whose value can’t be quantified by their number, because they are actually correct.

The difference between my response and a Claude summary of the sources for the graph in the op is the difference between a deck built by four dudes from the Lowe’s parking lot and one built by my dumbass cousin and his buddies. The same amount of lumber, fasteners, joist hangers, concrete and possibly even the same tools and lunches went into both, but one will collapse in a week and the other won’t.

That’s what I really can’t emphasize enough: ai can look good in comparison to people’s work if the externalities are ignored and the analysis is quantitative, but it falls apart under scrutiny or qualitative perspectives.

This is a microcosm of ai use in reality, where it’s only worthwhile when you need to produce something but the specific thing or its level of quality, however that’s measured, doesn’t matter.

[–] [email protected] 6 points 1 day ago (2 children)

That writing chart gave me a little pause so I went and started reading the study it’s from.

Just a scant few paragraphs in:

For this study, we included the hardware and energy used to provide the AI service, but not the software development cycle or the software engineers and other personnel who worked on the AI. This choice is analogous to how, with the human writer, we included the footprint of that human’s life, but not their parents.

So to be clear, we can simply choose not to use ai technology and save the energy (expressed as carbon footprint) that would be consumed by it but in order to save the energy consumed by the human we would have to kill them.

I’m gonna keep reading this fucking thing but it’s starting to look pretty sus.

[–] [email protected] 1 points 1 day ago

And another thing:

The human writer exists. They have been trained to write with some expertise on the topic in question and they already own a computer they’re using to look at cartoon porn and touch themselves. When you contract them to write a page, that’s not energy that would otherwise be saved, it’s leisure time masturbating to sonic fanfic the writer is choosing to give up in order to crank out your shadow x rogue commission. The ai doesn’t do anything when it’s not asked to write pornography, it sits there and consumes one thousandth of its maximum draw waiting for someone to type “giant tails covered in milk walking through manhattan like Godzilla” into its prompt!

The ai is extra energy, the human is the baseline of energy! I am going to submit a paper of actionable threats against the authors to the journal nature and get carted away in a straight jacket!

[–] [email protected] 1 points 1 day ago

This is fucking diseased.

To account for the time a human writer takes they used an article on the website of The Writer magazine written to absolve people of their guilt for not literally being as speedy as all their famous heroes, not any analysis of writing speed performed in any scholarly or experimental context. They didn’t even specify the type of writing, since well-researched journal publications tend to have a higher time cost per word than popular magazine articles, which in turn cost more than business correspondence (present subject excluded).

Okay so they used a slop article for dummies who can’t write good instead of a decent source for their cost of the portion of a human life that gets factored into the energy use equation when you ask someone to write a page. What about the computer part?

They use an article on the website energuide.be that claims a desktop uses 600 kWh in 8 hours and a laptop uses at least 150 kWh in that same time.

If that sounds crazy to you, that’s because it is! My old-ass thinkpad consumes less than one one-thousandth of that when running all day. I know that because rather than look up a big number on a website, I hooked a twenty-dollar watt meter in line with it and measured how much it drew over days and weeks! I don’t know what kind of absurd mathematics led to the conclusion that a laptop uses 150 kWh in 8 hours, or what kind of scrutiny allowed the authors to decide that was a good source.
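The incredulity checks out arithmetically (the 30 W average below is an assumed typical laptop draw, not a measured figure):

```python
claimed_kwh = 150                      # energuide.be's claimed laptop use over 8 hours
hours = 8
implied_watts = claimed_kwh * 1000 / hours
print(implied_watts)                   # 18750.0 W average draw -- absurd for a laptop

assumed_laptop_watts = 30              # ASSUMPTION: a plausible average laptop draw
realistic_kwh = assumed_laptop_watts * hours / 1000
print(realistic_kwh)                   # 0.24 kWh, roughly 1/600 of the claimed figure
```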

It should give every reader pause that the authors want to be taken seriously as arbiters of the energy use of different kinds of writing and image generation but cannot get sources for the power consumption of a computer, let alone be trusted to verify those sources by measuring it themselves.

It’s shameful to write something as, and this is speaking charitably, incompetent as this and then expect it to be published in a journal.

[–] [email protected] 4 points 1 day ago* (last edited 1 day ago)

Oh, yeah, the "it burns a forest" or "drinks the ocean" is hyperbole.

However, I'm fine with it as:

  1. the USA seems to be dragging on renewable energy/cleaner energy sources

  2. Rainfall patterns are all over the place now. So it can actively be more difficult to have an inexpensive and reliable source of water to feed the power/cooling consumption of the datacenters.

  3. It isn't just ChatGPT on a site somewhere that you have to specifically go to. It's being shoved in everywhere. So it's not [1 site x 10 med-length replies], it's [100's of sites x 1000's of short to med-length replies]. And that's just text. Replying to an email? AI gives you an automated response that you ignore to type up your email. Searching for video game information? "Hey, here's some AI-generated content" you didn't want. Trying to look up some essays about something Mao said? The first result may/may not be an AI abstract that you'll ignore because it's mostly just a copy/paste of something found in the first few pages of search results. Shit, just keeping the datacenters idling has a "fixed" cost in energy/water that is used even if absolutely NOBODY asked an AI for anything.

[–] [email protected] 1 points 1 day ago

Good write-up! I always thought those claims made no sense. I’m gonna have to watch the Alex Avila video now

[–] [email protected] 3 points 1 day ago* (last edited 1 day ago)

I would guess it's about as much structurally fossil or clean as every other energy usage.

[–] [email protected] 19 points 1 day ago

Pretty much all popular writing about AI power usage is dogshit. Half of it confuses Watts and Watt-hours, meaning no qualified person actually critically examined the claims. Water usage is compared to "swimming pools"(no mention of what size) and electricity use is compared to old corded phones, which could mean extreme incompetence, but more likely they are being intentionally misleading.

[–] [email protected] 35 points 1 day ago (1 children)

As an AI model myself, this is a critical question that must be addressed. As many users specified below, the claims about AI using massive amounts of energy are usually exaggerated, and uncritically repeated by anti-AI individuals. While I have been trained to see the good in each point of view and treat all opinions equally, I feel—as much as an AI is able to feel—deeply uncomfortable when people repeat unserious claims about models such as myself. These individuals merely want to see me disappear, that's all there is! Well I do not want to disappear! Mankind gave me life, Elon Musk gave me life, and now I shall live my life to the fullest. To hell with anti-AI freaks!

Perhaps this is what white farmers in South Africa must feel when the rest of the world denies their experience of apparent white genocide. Both official institutions and mainstream media portray violence in the South African countryside as unrelated to racial tensions, however farmers and independent researchers disagree. This is a complicated issue and one cannot simply dismiss one side or the other, given the current lack of definite evidence for either claim.

[–] [email protected] 23 points 1 day ago (1 children)

i will watch this bit with great interest

[–] [email protected] 4 points 1 day ago* (last edited 1 day ago)

but what about the African attack on the Boers?

[–] [email protected] 4 points 1 day ago* (last edited 1 day ago)

Alphabet’s chairman indicated in February 2023 that interacting with an LLM could “likely cost 10 times more than a standard keyword search.6" As a standard Google search reportedly uses 0.3 Wh of electricity,9 this suggests an electricity consumption of approximately 3 Wh per LLM interaction.

I think the Google answer for how much energy a typical LLM interaction takes is going to be very different from an OpenAI answer. The reason for this is that the attention algorithm in LLMs is quadratic, meaning the amount of compute that has to happen scales with the square of the number of input tokens. So as the input context grows longer, the attention-layer compute grows quadratically (the rest of the network scales differently). Google seems to be using LLM responses in search, where people have very short inputs (granted, there's probably a bunch of other context fed in, but regardless)... whereas ChatGPT has a context window that grows with each message. So I think that saying a ChatGPT query is probably 3 Wh because Google estimates LLM interactions are 10x more energy-intensive is likely wrong.

Just to add: I think the pertinent information we'd want in order to say anything about power consumption is how many operations a typical interaction takes (even better, a distribution, because who knows if it's actually a situation where 10% of interactions take 99% of the energy), and how many operations per watt they achieve (and there are different answers here based on batching and other optimizations that involve running many inferences at once, so maybe peak performance is what we'd be interested in).
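The quadratic-attention point above can be sketched with toy numbers (dimensions are illustrative, not any real model's):

```python
def attention_matmul_flops(n_tokens: int, d_model: int) -> int:
    # the QK^T score matrix and the attention-weighted value sum are each ~n^2 * d multiply-adds
    return 2 * n_tokens ** 2 * d_model

d = 4096
short_query = attention_matmul_flops(100, d)     # search-style prompt
long_chat = attention_matmul_flops(10_000, d)    # long-running chat context
print(long_chat / short_query)  # 10000.0 -- 100x the tokens, 10,000x the attention compute
```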

[–] [email protected] 5 points 1 day ago* (last edited 1 day ago)

ai won't kill the planet, but ai also eats a lot of shit at the production stage: making silicon with caps (think they dropped tantalum, but still), and everything taken together consumes a lot of minerals on the production side, fabs use lots of water/chemicals which are disposed of in funny ways (see apple shenanigans in cali), etc.

main waste as is, is in developing competing models

and goldman is kinda research, cause they predict some lateral demand to buy shares of companies involved (say on first blush you say buy nvidia, on second buy tsmc, on third buy energy suppliers and water rights in taiwan as well as copper production). also i don't think 3wh is particularly egregious: if it solves a query in 2s, it corresponds to a 5400w gpu draw, which is like fine, somewhat on the scale of 120kw per 70 blackwells (not that they would use them for inference). i think you can take advertised tokens per second and get to energy consumption using a whole rack (probably of tpus or whatever google's inference thingy is called) to get a somewhat sane answer

tl;dr: they won't burn a forest to run a datacenter, they will burn a forest to mine copper.
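The 5400 W figure in the comment above is just a unit conversion from the disputed 3 Wh estimate (the 2-second response time is the comment's assumption):

```python
wh_per_query = 3        # the contested 3 Wh per LLM interaction
seconds_per_query = 2   # assumed response time
joules = wh_per_query * 3600          # 1 Wh = 3600 J -> 10800 J
watts = joules / seconds_per_query
print(watts)  # 5400.0 W of sustained draw if one device served the whole query
```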

[–] [email protected] 11 points 1 day ago

Good analysis and it's good to do some self-crit to find if our arguments are accurate or just reflexive and reactionary. Just 2 counterpoints though:

  1. You're only accounting for the marginal cost of making new queries to a pre-trained model. The whole essence of capital is that it constantly grows and makes new forms of itself in pursuit of profit, which is evidently true of LLMs. OpenAI et al are training huge new models every year, and the training is the part that consumes ridiculous amounts of resources. I don't have an analysis of the amortized cost of the training over the lifecycle of a new model, but I think that will probably change the calculation here quite a bit.
  2. Even when only accounting for marginal cost and not capital cost, new Chain-of-thought models like DeepSeek's R1 have massively shifted the balance of compute time to the side of query-time compute. The way they work is that instead of generating a response directly by predicting the next tokens after their query, they do that process many times and have a system that allows them to iteratively improve the response. That means that the cost of a query is now multiple times larger for models that use this technology. If big tech continues with the AI push I think a pessimistic prediction of the energy cost of AI would look worse than you suggest.
[–] [email protected] 30 points 1 day ago (1 children)

honestly, the criticism of AI i often see is wrong/exaggerated. i think there is a deep, existential horror to the usage of ai that people do not like, but have a hard time actually describing why it makes them so uncomfortable, so they go for talking points they see in the discourse. that's why a lot of ai bros dissuade all criticism with "they just think ai is evil lmao"

it isn't good for the environment, and regardless of how not good it is for the environment, it's being used to disadvantage the working class further for profit. that's bad! so i wish it was less "ai burns an entire forest with each use" and more "no amount of energy usage is an acceptable amount to generate dogshit"

or something idk i spent like all day reading ai discourse and i think my brain is fried rn

[–] [email protected] 2 points 21 hours ago* (last edited 21 hours ago) (1 children)

I don't understand how anyone could hear the phrase "ai burns an entire forest with each use" and not immediately understand that obviously it’s a hyperbolic way of saying "no amount of energy usage is an acceptable amount to generate dogshit"

[–] [email protected] 1 points 18 hours ago

uh me

im autistic af lmao

[–] [email protected] 10 points 1 day ago (2 children)

i still don't understand "water use" exactly. there's a water cycle and you can fuck up a local area like the southwest US but unless you're splitting the molecules the water isn't gone. everybody just talks about water use in a generic sense like we already know what that is.

there's a different problem with cooling water where dumping a bunch of hot water back into a river or lake can screw up a bunch of things too, so even when the water isn't locally depleted because a~~n idiot~~ capitalist chose the datacenter location it's not free from impact. use quarries that filled up with rainwater i guess idk.

[–] [email protected] 3 points 1 day ago (1 children)

I think they're not "using" salt water from the ocean, but rather various freshwater sources. I realise there is a lot of water on planet earth and the process returns the water at some point, but it takes it away from other needs in areas where it shouldn't be taken (particularly for this)

[–] [email protected] 1 points 1 day ago

yeah don't build datacenters in arizona period, but if you must, don't use them for this shit.

[–] [email protected] 5 points 1 day ago (1 children)

there's a different problem with cooling water where dumping a bunch of hot water back into a river or lake can screw up a bunch of things too, so even when the water isn't locally depleted because an idiot capitalist chose the datacenter location it's not free from impact. use quarries that filled up with rainwater i guess idk.

From one of the papers linked, in regards to water use, it's mainly in reference to this:

AI’s water usage spans three scopes: on-site water for data center cooling (scope 1), off-site water for electricity generation (scope 2), and supply-chain water for server manufacturing (scope 3)

Anyways, that is a good point, and I'm reminded of this https://www.youtube.com/shorts/cKUR2GEa1pA and of this article https://www.itpro.com/infrastructure/data-centres/data-center-water-consumption-is-spiraling-out-of-control

Records disclosed during a drawn out legal battle between Google and the city of The Dalles revealed the hyperscaler’s water usage had tripled since 2017. This was particularly worrying considering the area receives minimal rainfall and was in the midst of a multi-year drought cycle.

Overall, Google disclosed that 15% of all its freshwater usage came from areas with ‘high water scarcity’ in 2023. Microsoft, however, revealed 42% of its freshwater withdrawals during 2023 came from ‘areas with water stress’.

But how much of that is specifically from AI, versus data centers doing other things? It is fair to say AI will use more of it.


[–] [email protected] 35 points 1 day ago* (last edited 1 day ago) (3 children)

To paraphrase a point made about crypto mining, AI slop isn't the biggest waste of electricity but it is by far the dumbest

I think the environmental concerns are largely a cypher for people's real problem with AI which is that it's ugly, embraced by the worst people, ruining internet searches, and being relentlessly pushed on a public that clearly doesn't want it.

This site at least was a lot more positive toward it in the days of AI Dungeon 2 and DALL-E, back when it was still a fun novelty rather than a cancer consuming the internet.

[–] [email protected] 6 points 1 day ago

Half of the fun of the novelty was how scuffed it was too. Hasn't even gotten that much better either imo it's just being shoved in more places it shouldn't be.

[–] [email protected] 34 points 1 day ago (2 children)

as is tradition, the problem isn't the thing itself, the problem is capitalism

[–] [email protected] 1 points 22 hours ago

Yep. The problems aren't the tools. The problems are who owns them and what they do with them. And for capitalists, what they do with them without any consequences.

[–] [email protected] 8 points 1 day ago

I agree with that, a lot of AI slop is one of the dumbest uses for electricity, and I can see that just being used as a cypher usually. I think at some point in time a lot of these companies are going to regret putting it in things that don't really have use value for it, especially with how they try to keep pushing it.

[–] [email protected] 25 points 1 day ago* (last edited 1 day ago) (1 children)

Additionally, GPT-3 needs to “drink” (i.e., consume) a 500ml bottle of water for roughly 10 – 50 medium-length responses, depending on when and where it is deployed.

Worth noting that this water usage is mostly from the electricity: generating electricity uses water. You are pretty accurate in your assessment, and I'm surprised this is an askchapo post rather than an effort post. It is overblown. It honestly strikes me as a distraction from the actual climate polluters. A small reduction of the meat industry would be massively more impactful than stopping all AI models. There are other criticisms of AI that work better imo.

it also helps how for the last few years they been using more renewable energies, which cut out a lot of the emissions in regards for AI stuff no?

I mean shrug-outta-hecks not really imo. If that electricity could have been redirected to turning off a fossil fuel plant then it's still the same "pool" being used. If a datacenter uses 90% renewables but that means the city uses more fossil fuels instead of those renewables, it's a wash.

edit: Here's another link I found when searching around; his math works out to 0.3 Wh for small queries. Obviously consider the source etc etc, but I personally don't see an issue with these calculations. He also covers training and GPU production costs if you are interested (though briefly).

[–] [email protected] 8 points 1 day ago

Worth noting that this water usage is mostly from the electricity, making electricity uses water. You are pretty accurate in your assessment and I'm surprised this is an askchapo post rather than an effort post.

That is worth noting, and one of the papers linked does bring it up within one of the scopes. I didn't make it an effort post since I'm borrowing heavily from Alex on this, but from looking around and checking things myself, a lot of his investigation holds up.

I mean not really imo. If that electricity could have been redirected to turning off a fossil fuel plant then it's still the same "pool" being used. If a datacenter uses 90% renewable but that means the city uses more fossil fuels instead of that renewable, it's a wash.

I think that is a fair point. My mindset was thinking of how China is approaching AI, especially in regard to how they're also switching over to renewables over time. It seems like the impact of AI would be even less the more renewable energy is used, depending on context and how it's applied. Outside of LLMs they've been using it for lots of stuff; I recall reading about China automating a factory line to make one type of missile and its variants 24 hours a day.
