this post was submitted on 14 Jul 2024
681 points (95.6% liked)
Technology
I don't see what's surprising here. They provide services for users globally. Not that it's justified; it's just kind of weird that people apparently think global-scale computing is light on electricity.
Showers worldwide use more water than ....
The thing here also is that I can't see that they've taken into account that they deliver data-center services globally.
Say my company has 100 VMs in Azure. That energy usage should count toward our company and country, not Microsoft.
Google originally made a name for themselves by building a global search engine on low-cost, low-powered desktop machines running in parallel, so it's surprising that they've gone from high efficiency to power hogs.
Not surprising at all. Power hogging is the whole point of capitalism. It's just literally electric power in this case.
Who said they're not efficient? They serve billions of users. I'd be surprised if they hadn't figured out how to do it more efficiently than Bing PER REQUEST. They have PhDs sitting around thinking about how to lower power consumption by 1%.
Lots of people were just yelling that the grid can't handle more load, like for charging cars, while Google adds a country's worth of power use with AI.
Google builds entire datacenters with their own transformers and power lines, if not their own power plants. These datacenters plug directly into the high-voltage networks, which don't have big capacity problems.
The low-voltage grids in residential areas, on the other hand, were built as cheaply as possible, so increasing the load by 20% is already too much for most of them.
Don't forget to set your AC to 80 because the grid can't handle the load lol. That's exactly why this info is important: ecological solutions are somehow always thrust upon individuals when the vast majority of the issue lies with corporations.
Those corporations are serving users, they wouldn't need all that power if billions weren't using their services
And I'm using my AC, we're both using power. How many times has the government told one of these companies to use less power because the grid can't handle the strain their servers put on it?
And it's not like these companies aren't herding people toward these cloud services. A few weeks ago my Google Cloud storage was maxed out so I wanted to delete some photos/videos off their cloud while keeping them on my phone. Legit couldn't figure out how to do that and just ended up deleting stuff permanently.
lol yeah whatever. keep shilling.
https://www.cnbc.com/2024/07/02/googles-carbon-emissions-surge-nearly-50percent-due-to-ai-energy-demand.html
Again, users actually use it
It's not surprising per se, but it's something people should be more aware of. And a lot of this consumption is going not to global services (like Google search or the Workspace suite) but to the whole AI hype.
I didn't find numbers for Google or Microsoft specifically, but training GPT-4 consumed around 50 GWh on its own. Daily consumption for queries is estimated at 1-5 GWh.
Even if that extrapolation is an overestimate, and calculating the actual consumption is pretty much impossible, it's still probably a lot of energy wasted on a product that people do not want (e.g. Google AI "search", Bing and Copilot being stuffed into everything).
To put a bit of context on those numbers: 50 GWh is a single medium-sized power station running for 2 days, to create something that is used around 10 million times a day all over the world.
At 10 million queries per day, that puts the usage per query at 100-500 Wh: about the amount of energy used by leaving an old incandescent lightbulb on for an hour, or playing a demanding video game for about 20 minutes.
As another comparison, in the USA alone around 12,000 GWh of energy is spent burning gasoline in vehicles every single day. So Americans driving 1% less every day would save far more energy than creating GPT-4 and the world using it.
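Taking the thread's own estimates at face value (50 GWh for training, 1-5 GWh per day for queries, 10 million queries a day, 12,000 GWh of US gasoline per day), the arithmetic can be sanity-checked like this; the inputs are the commenters' claimed figures, not measured data:

```python
# Back-of-envelope check of the energy figures discussed above.
# All inputs are the thread's own estimates, not measured data.

TRAINING_GWH = 50           # claimed one-time cost of training GPT-4
DAILY_QUERY_GWH = (1, 5)    # claimed range of daily inference consumption
QUERIES_PER_DAY = 10_000_000

# Energy per query in watt-hours (1 GWh = 1e9 Wh):
per_query_wh = tuple(g * 1e9 / QUERIES_PER_DAY for g in DAILY_QUERY_GWH)
print(per_query_wh)  # (100.0, 500.0) -> roughly an incandescent-bulb-hour or more

# US gasoline burn, ~12,000 GWh per day (thread's figure);
# 1% of one day is 120 GWh, more than double the claimed training cost:
us_gasoline_gwh_per_day = 12_000
savings_gwh = us_gasoline_gwh_per_day * 0.01
print(savings_gwh)                  # 120.0
print(savings_gwh > TRAINING_GWH)   # True
```

Note that a single day's 1% saving (120 GWh) covers training plus roughly two weeks to four months of queries at 1-5 GWh/day, which is why the comparison only holds if the driving reduction is sustained.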
They only do that because they project it to be profitable, i.e. they project demand for it.
It's also ridiculous to claim that people don't want it just because you don't.
All of that AI crap they keep pushing certainly doesn't help the energy consumption though.
For sure
It sounds scary, and that's all that's needed to get clicks.