this post was submitted on 12 Jul 2024
202 points (91.7% liked)
Technology
I don't know if they're conflating rendering with display or just assuming those GPUs are at max TDP 24/7, but they're way off on actual energy consumption.
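For what it's worth, the gap between those two assumptions is easy to put numbers on. A quick sketch of the comparison (the wattages below are made-up illustrative values, not figures from the article):

```python
# Rough illustration: annual energy if a GPU is assumed to sit at max TDP 24/7
# versus a more realistic average draw for a display workload.
# Both wattages are assumptions for illustration, not figures from the article.

HOURS_PER_YEAR = 24 * 365

def annual_kwh(watts: float, hours: float = HOURS_PER_YEAR) -> float:
    """Convert a constant power draw in watts to kWh over the given hours."""
    return watts * hours / 1000.0

max_tdp_w = 300.0      # hypothetical card TDP
typical_draw_w = 90.0  # hypothetical average draw while driving a display

print(f"At max TDP 24/7: {annual_kwh(max_tdp_w):,.0f} kWh/year")
print(f"At typical draw: {annual_kwh(typical_draw_w):,.0f} kWh/year")
# With these made-up numbers, the max-TDP assumption overstates consumption by ~3x.
```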
There seem to be a lot of recent articles attacking datacenters, particularly those involved in LLM "AI" work. This feels like one of those articles.
I'm not saying we shouldn't keep them in check, but I also don't like being manipulated by "grass roots initiative" marketing companies, particularly on Lemmy.
The AI numbers are pretty solid. Papers published on Hugging Face list training times and the hardware platform, and that can be converted into CO2 estimates. Those runs are at full load for weeks or months across arrays of GPUs.
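The usual back-of-envelope methodology behind those estimates is GPU-hours × average power × datacenter PUE × grid carbon intensity. A minimal sketch, where the GPU count, wattage, PUE, and grid factor are all illustrative assumptions rather than numbers from any particular paper:

```python
# Back-of-envelope training-emissions estimate:
# GPU-hours x average power x datacenter PUE x grid carbon intensity.
# All inputs below are illustrative assumptions.

def training_co2_kg(num_gpus: int,
                    hours: float,
                    avg_gpu_watts: float,
                    pue: float = 1.2,
                    grid_kg_co2_per_kwh: float = 0.4) -> float:
    """Estimate CO2 in kg for a training run from hardware and grid assumptions."""
    energy_kwh = num_gpus * hours * avg_gpu_watts / 1000.0
    return energy_kwh * pue * grid_kg_co2_per_kwh

# e.g. 512 GPUs running flat out for 30 days at ~350 W average each
print(f"{training_co2_kg(512, 30 * 24, 350.0):,.0f} kg CO2")
```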
In this case, I don't see why you'd need that kind of hardware for this application. You might be right that it's not running at maximum load; if so, somebody has been mis-sold the hardware. Whatever it's doing, though, it will be at a consistent load, because it's always doing the same thing.
It's probably that only their professional-tier cards are built to handle synchronization at that scale. There are obviously other massive displays out there, but they're also using specialized and expensive hardware to handle all the signal processing.