this post was submitted on 02 Jul 2024

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

(page 3) 15 comments
[–] [email protected] 0 points 10 months ago (5 children)

These fucking nerds are all so hot to create the first real-life JARVIS from Marvel's Iron Man that they're willing to burn the planet down to get there.

Half of them believe that the super smart AI they build will solve the energy problem for them, they just have to somehow build it first.

Just the astounding outright hubris of it all.

[–] [email protected] 0 points 10 months ago (1 children)

When they fail, it won't be their fault of course, it'll be AI's fault.

[–] [email protected] 0 points 10 months ago (5 children)

This is why philosophy should be mandatory in college (and possibly high school). Die Frage nach der Technik (The Question Concerning Technology) by Heidegger discusses this misconception that technology can solve all of our problems. He was thinking about this issue in 1954.

Music and art are also important to study. In “Faith Alone” by Bad Religion, the lyrics include these lines:

Watched the scientists throw up their hands conceding, "Progress will resolve it all"

Saw the manufacturers of earth's debris ignore another Green Peace call

Greg Graffin was discussing this in 1990.

[–] [email protected] 0 points 10 months ago (45 children)

there’s this type of reply guy on fedi lately who does the “well actually querying LLMs only happens in bursts and training is much more efficient than you’d think and nvidia says their gpus are energy-efficient” thing whenever the topic comes up

and meanwhile a bunch of major companies have violated their climate pledges and say it’s due to AI, they’re planning power plants specifically for data centers expanded for the push into AI, and large GPUs are notoriously the most power-hungry part of a computer, dumping a ton of heat that has to be cooled in a way that wastes and pollutes a fuckton of clean water

but the companies don’t publish smoking gun energy usage statistics on LLMs and generative AI specifically so who can say

[–] [email protected] 0 points 10 months ago

I always thought data centers ran clean and dirty loops of cooling (as far as computers are concerned).
The clean loop has all the chemicals and whatnot to keep cooling blocks and tubing "safe". The dirty side is just plain old water. And a big heat exchanger transfers the heat from the clean (hot) loop to the "dirty" (cold) side.
Is there really that much pollution in that? Can't be worse than rain going through storm drains or whatever.

But AI does use a phenomenal amount of power.
And, IMO, it's a problem compared to the lack of value people are getting from AI.
The new Blackwell B200 consumes 1.2 kW of power, and will produce 1.2 kW of heat.
A cooling system with a COP of 5 needs to consume 240 W to dissipate this.
The backplane for the B200 holds 8 of these GPUs in a 10 RU space, and with overheads will peak at 14.3 kW (cooling would be ~3 kW of consumption).
So, a 42U data center rack with 3 of these, supporting hardware, and UPS efficiencies (80%) is going to be 52 kW (+10 kW cooling). 62 kW total, which is like 4 homes drawing their full load all the time.
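The arithmetic above can be sanity-checked with a quick sketch. The 1.2 kW TDP, COP of 5, 14.3 kW system peak, and 80% UPS efficiency are the figures from this comment; the exact rack total lands slightly above the rounded 52 kW quoted:

```python
# Back-of-envelope check of the rack power figures above.
# Assumed inputs (from the comment, not measured data):
# B200 TDP 1.2 kW, cooling COP of 5, 8-GPU system peaking at
# 14.3 kW with overheads, 3 systems per 42U rack, 80% UPS/PSU
# efficiency.
GPU_TDP_KW = 1.2
COP = 5
SYSTEM_PEAK_KW = 14.3
SYSTEMS_PER_RACK = 3
UPS_EFFICIENCY = 0.80

cooling_per_gpu = GPU_TDP_KW / COP                          # 0.24 kW to remove 1.2 kW of heat
rack_it_load = SYSTEMS_PER_RACK * SYSTEM_PEAK_KW / UPS_EFFICIENCY  # ~53.6 kW at the wall
rack_cooling = SYSTEMS_PER_RACK * SYSTEM_PEAK_KW / COP      # ~8.6 kW to cool the rack

print(f"{cooling_per_gpu * 1000:.0f} W cooling per GPU")
print(f"{rack_it_load:.1f} kW rack IT load, {rack_cooling:.1f} kW cooling")
```

So the "52 kW + 10 kW" in the comment is a reasonable rounding of ~53.6 kW + ~8.6 kW.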

I hope they finally find an application for AI, instead of just constantly chasing the dragon with more training, more parameters, more performance etc.

[–] [email protected] 0 points 10 months ago (1 children)

“It only uses 5x as much energy as a regular search! Think of how much energy YOU’RE using with searches!” Okay, so you’re just using 5x as much energy for worse results? And also probably doing it more often than people who just use a normal search engine, because they don’t expect the search engine to talk to them. I’ve never understood how that was supposed to be an exoneration for it, even without taking into account that nobody ever seems to know whether or not that figure includes energy spent on training.

[–] [email protected] 0 points 10 months ago

AI bros use literally the same whatabout excuses for their ghastly power consumption that I know from years of bitcoin bros doing the same

like, at least christmas lights bring joy

[–] [email protected] 0 points 10 months ago* (last edited 10 months ago) (2 children)

The last part is absolutely false. The Nvidia H100 TDP is like 700W, though ostensibly configurable. The B200 is 1000W. The AMD MI300X is 750W.

They also skimp on RAM with many SKUs so you have to buy the higher clocked ones.

They run in insane power bands just to eke out a tiny bit more performance. If they ran at like a third of their power, I bet they would be at least twice as power efficient, since power use scales nonlinearly with voltage/clock speed.
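The nonlinear scaling claim can be sketched with a toy model: dynamic CMOS power goes roughly as C·V²·f, and since voltage has to scale with frequency, power grows roughly with the cube of clock speed while throughput grows only linearly. The cubic exponent and the 0.7 clock scale below are illustrative assumptions, not measured H100/B200 behavior:

```python
# Toy model of perf-per-watt vs. clock speed.
# Assumption: dynamic power ~ C * V^2 * f, with V scaling
# proportionally to f, giving power ~ f^3; throughput ~ f.
# Illustrative only, not vendor data.
def relative_power(f_scale: float) -> float:
    """Power at a given clock scale, relative to full clock."""
    return f_scale ** 3  # V ~ f  =>  P ~ f * f^2

def perf_per_watt(f_scale: float) -> float:
    """Efficiency relative to full clock (throughput ~ f)."""
    return f_scale / relative_power(f_scale)

print(perf_per_watt(1.0))  # baseline efficiency
print(perf_per_watt(0.7))  # ~2x efficiency at 70% clock
```

Under this model, even a modest downclock to 70% roughly doubles perf-per-watt, which is the intuition behind the comment.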

But no, just pedal to the metal. Run the silicon as hard as it can, and screw power consumption.

[–] [email protected] 0 points 10 months ago (2 children)

Other AI companies like Cerebras are much better, running at quite sane voltages. Ironically (or perhaps smartly), the Saudis invested in them.

it’s real bizarre you edited this in after getting upvoted by a few people

[–] [email protected] 0 points 10 months ago (1 children)

banned them for the subtle spam attempt

[–] [email protected] 0 points 10 months ago (2 children)

You never post, second-guess yourself, and then go research? Really easy to explain.

[–] [email protected] 0 points 10 months ago

I usually append them at the end of the comment preceded by "Edit:" for transparency.

[–] [email protected] 0 points 10 months ago (1 children)

do the results of your personal research frequently look like marketing horseshit?

[–] [email protected] 0 points 10 months ago* (last edited 10 months ago)

(notably posted at a 7min delta after the other comment with oh so specific details, and just entirely dismissing the man behind the curtain as to the plurality of compute involved)

[–] [email protected] 0 points 10 months ago

wow, exemplary performance
