Either that, or taking an entire lake's worth of water per generated response. I see this a lot on Hexbear and I'm genuinely curious whether I've been misinformed. How true are the claims about the climate impacts of LLMs? Is the "burning a forest, draining a lake" framing accurate, or is it hyperbole? Because looking into this somewhat, I see contradictions. Alex Avila pointed out a lot of these contradictions, and I'm going to use what he said on this from his video AI Wars: How Corporations Hijacked Anti-AI Backlash, at around 2:40:42.

Anyways, for example, if I go into a random article, like this NPR one, the writer cites Goldman Sachs for these claims:

According to a report by Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity as a Google search query.

Goldman Sachs has researched the expected growth of data centers in the U.S. and estimates they’ll be using 8% of total power in the country by 2030, up from 3% in 2022. Company analysts say “the proliferation of AI technology, and the data centers necessary to feed it” will drive a surge in power demand “the likes of which hasn’t been seen in a generation.”^[NPR: AI brings soaring emissions for Google and Microsoft, a major contributor to climate change]
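Just as a sanity check on the scale of that projection, here's the arithmetic on those share figures (a rough sketch in Python; it uses only the 3% and 8% numbers from the quote, nothing else):

```python
# Rough arithmetic on the Goldman Sachs projection quoted above:
# data centers going from 3% of total US power in 2022 to 8% by 2030.
start_share = 0.03   # 2022 share of total US power
end_share = 0.08     # projected 2030 share
years = 2030 - 2022  # 8 years

# Implied compound annual growth rate of the data-center share
cagr = (end_share / start_share) ** (1 / years) - 1
print(f"Implied annual growth in share: {cagr:.1%}")  # ~13.0% per year
```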

First off, I didn't know Goldman Sachs was a research institution? I thought they were a financial and banking institution, which makes it strange that NPR is citing them. But NPR is one of many, yes? So let's go to a different article, from UNEP:

Third, data centres use water during construction and, once operational, to cool electrical components. Globally, AI-related infrastructure may soon consume six times more water than Denmark, a country of 6 million, according to one estimate. That is a problem when a quarter of humanity already lacks access to clean water and sanitation.

Finally, to power their complex electronics, data centres that host AI technology need a lot of energy, which in most places still comes from the burning of fossil fuels, producing planet-warming greenhouse gases. A request made through ChatGPT, an AI-based virtual assistant, consumes 10 times the electricity of a Google Search, reported the International Energy Agency. While global data is sparse, the agency estimates that in the tech hub of Ireland, the rise of AI could see data centres account for nearly 35 per cent of the country's energy use by 2026.^[UNEP: AI has an environmental problem. Here's what the world can do about that.]

or one from The Commons.

According to the IEA, while a single Google search takes 0.3 watt-hours of electricity, a ChatGPT request takes 2.9 watt-hours.^[The Commons: Understanding AI's environmental footprint]

or Axios.

One oft-cited rule of thumb suggested that querying ChatGPT used roughly 10 times more energy than a Google search — 0.3 watt-hours for a traditional Google search compared with 2.9 watt-hours for a ChatGPT query.^[Axios: AI's climate impact is still a black box]

So, looking at the Goldman Sachs study,^[Goldman Sachs: AI is poised to drive 160% increase in data center power demand] you find this claim:

A single ChatGPT query requires 2.9 watt-hours of electricity, compared with 0.3 watt-hours for a Google search, according to the International Energy Agency.

Which cites the International Energy Agency, much like The Commons. In particular, they cite this study,^[IEA: Electricity 2024 Analysis and forecast to 2026] which mentions this:

Market trends, including the fast incorporation of AI into software programming across a variety of sectors, increase the overall electricity demand of data centres. Search tools like Google could see a tenfold increase of their electricity demand in the case of fully implementing AI in it. When comparing the average electricity demand of a typical Google search (0.3 Wh of electricity) to OpenAI’s ChatGPT (2.9 Wh per request), and considering 9 billion searches daily, this would require almost 10 TWh of additional electricity in a year.
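For what it's worth, that "almost 10 TWh" figure is easy to reproduce from the numbers in the quote (a rough sketch; the 9 billion searches per day is the IEA's assumption, and it's ambiguous whether they mean the gross or the incremental total):

```python
# Reproducing the IEA's "almost 10 TWh" back-of-the-envelope figure
# from the numbers in the quote above.
google_wh = 0.3          # Wh per standard Google search (IEA)
chatgpt_wh = 2.9         # Wh per ChatGPT request (IEA)
searches_per_day = 9e9   # daily searches assumed by the IEA

# Additional energy if every search cost ChatGPT-level energy instead
extra_twh = (chatgpt_wh - google_wh) * searches_per_day * 365 / 1e12
print(f"Additional electricity: {extra_twh:.1f} TWh/year")  # ~8.5 TWh

# Gross energy at ChatGPT-level cost (closer to "almost 10 TWh")
gross_twh = chatgpt_wh * searches_per_day * 365 / 1e12
print(f"Gross electricity: {gross_twh:.1f} TWh/year")       # ~9.5 TWh
```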

And for that figure the IEA cites this paper by De Vries^[The growing energy footprint of artificial intelligence]

The Axios article links to a different study, but that study links back to the De Vries paper. So it's interesting how a lot of these lead back to the De Vries paper. To quote the relevant portion from De Vries:

Alphabet’s chairman indicated in February 2023 that interacting with an LLM could “likely cost 10 times more than a standard keyword search.6" As a standard Google search reportedly uses 0.3 Wh of electricity,9 this suggests an electricity consumption of approximately 3 Wh per LLM interaction.

Alex Avila points out how nonsensical this is in his video, at around 2:46:45. He also points out the Goldman Sachs connection, among other finance-capital connections to this, later in that video.

Mainly, he points out that the "10 times more than a standard keyword search" is a financial cost, not an energy cost, while the 0.3 Wh is an energy cost. It's nonsense to take 0.3 Wh and multiply it by 10, a guess about financial cost, to get something entirely new. I agree, and that 3 Wh per LLM interaction makes no sense, especially since it's based on a Google keyword search, not LLM usage. Alex also points out that the Google keyword search figure is from 2009, and a lot has changed since then.
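To make the unit mismatch concrete, here's the derivation being criticized, spelled out (a sketch of the flawed logic, not an endorsement of it):

```python
# The De Vries-style derivation those articles trace back to.
google_search_wh = 0.3   # Wh per Google search -- a 2009-era energy figure
cost_multiplier = 10     # Alphabet's "10x" -- a *financial* cost estimate

# What the paper effectively does:
llm_wh = google_search_wh * cost_multiplier
print(f"Derived: {llm_wh} Wh per LLM interaction")  # 3.0 Wh

# The problem: a dollars-based multiplier is applied to a watt-hours
# figure. Financial cost includes hardware, staffing, and margins, not
# just electricity, so the units don't transfer; a measured per-query
# energy figure for the LLM itself would be needed instead.
```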

In Alex's video he goes on to talk more about the issues with the De Vries study, but just based on this alone, those articles above are getting something wrong. I know the articles mention more, so I'll just bring up Alex's other points, since I think he did a good investigation of this.

One of the things Alex points out is that a lot of the actual energy use by AI companies and data-center companies isn't really transparent, so it's hard for us to know. How can we really know how much is used?

Another thing, to go back to Alex: in his video he mentions this study, The carbon emissions of writing and illustrating are lower for AI than for humans, which argues that AI's per-task emissions are very low (the study presents this in a graph).

This goes against what I hear others say, and against those articles above. Is there something wrong with that study?

Another thing worth mentioning (again, I'm referring to a lot of things Alex said) is that data centers only take up 1-2% of electricity use. Another article says this too:

Around the globe, data centers currently account for about 1 to 1.5 percent of global electricity use, according to the International Energy Agency.^[Scientific American: The AI Boom Could Use a Shocking Amount of Electricity]

and the IEA link in that article is different, since they linked to Data Centres and Data Transmission Networks.

What's interesting about that IEA page they linked is what they say here:

Data centres and data transmission networks are responsible for 1% of energy-related GHG emissions

Anyways, Alex also mentions that overall, data centers and AI only make up a very small fraction, which I think is a fair point. How has AI really changed anything with regard to climate change when the same issues are still at hand? Last I checked, the U.S. military is still one of the largest polluters. Vehicles produce more pollution. Agriculture too. Yet somehow AI is out-contributing all of these? I know energy generation also takes up a portion, but as has been pointed out, data centers only take up less than 2% of total electricity use, and AI is only a fraction of that 2% (one that's expected to grow), since data centers are used for other things besides LLMs.
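To put rough numbers on that fraction-of-a-fraction point (a sketch; the 20% AI share of data-center load is a hypothetical I'm assuming for illustration, not a figure from any of the sources above):

```python
# Ballpark for AI's implied share of global electricity use.
datacenter_share = 0.015  # data centers: ~1-1.5% of global electricity (IEA)
ai_fraction = 0.2         # hypothetical: assume AI is ~20% of data-center load

ai_share = datacenter_share * ai_fraction
print(f"AI's implied share of global electricity: {ai_share:.2%}")  # 0.30%
```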

Also worth mentioning is the idea that the increase in energy use might be overestimated as well. Going back to that video, at 2:57:00, I think it's worth watching his discussion of the issues with the IEA report on this stuff.

Besides that, another thing mentioned by Alex and others is water usage, but it's been pointed out that a lot of water is recycled in data centers, and some data centers use things like wastewater. Also, one thing Alex points out is that there is less water use by things like ChatGPT than claimed, referring to this paper, Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models, which says

Additionally, GPT-3 needs to “drink” (i.e., consume) a 500ml bottle of water for roughly 10 – 50 medium-length responses, depending on when and where it is deployed.

Also, a really good point Alex made is how a single hamburger has a bigger water footprint, at 1,695 liters of water for one hamburger. Another thing: from what I understand, training an AI does use a lot of energy, but when someone is interacting with an AI (inference, I think it's called), the energy cost is way less.
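And the hamburger comparison is easy to run from those two figures (a rough sketch using only the numbers above; actual per-response water use varies by when and where the model is deployed, as the paper says):

```python
# Water per ChatGPT response vs. one hamburger, from the figures above.
bottle_ml = 500                  # the paper's 500 ml bottle
responses_per_bottle = (10, 50)  # paper's range: 10-50 responses per bottle
hamburger_l = 1695               # liters of water for one hamburger

for n in responses_per_bottle:
    per_response_l = bottle_ml / n / 1000      # liters per response
    equivalent = hamburger_l / per_response_l  # responses per hamburger
    print(f"{bottle_ml / n:.0f} ml/response -> one hamburger "
          f"~ {equivalent:,.0f} responses")
# 50 ml/response -> ~33,900 responses; 10 ml/response -> ~169,500 responses
```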

Anyways, I'm just wondering if I'm being misled or getting some things wrong? It just seems like the climate change effects of AI are rather overblown. I think Alex did a really good investigation into this. I'm being genuine here, since to me a lot of AI stuff is overhyped, and I feel there's a bit of reactionary sentiment toward it, treating it as a scapegoat for everything wrong, which only leads to more overhyping.

There are some valid issues with AI, but besides that, China is showing proper uses for it, and it also helps that for the last few years they've been using more renewable energy, which cuts out a lot of the emissions from AI stuff, no? At least in regard to its electricity use.

[email protected] 6 points 1 day ago

Half of the fun of the novelty was how scuffed it was, too. It hasn't even gotten that much better either, imo; it's just being shoved into more places it shouldn't be.