this post was submitted on 25 Jul 2024
1009 points (97.5% liked)

Technology


The new global study, conducted in partnership with The Upwork Research Institute, interviewed 2,500 C-suite executives, full-time employees, and freelancers worldwide. The results show that optimistic expectations about AI's impact are not aligning with the reality faced by many employees: the study identifies a disconnect between managers' high expectations and the actual experiences of employees using AI.

Despite 96% of C-suite executives expecting AI to boost productivity, the study reveals that 77% of employees using AI say it has added to their workload and created challenges in achieving the expected productivity gains. Not only is AI increasing the workloads of full-time employees, it's also hampering productivity and contributing to employee burnout.

[–] [email protected] 0 points 3 months ago* (last edited 3 months ago) (1 children)

Your link is just about Google's energy use; it still says AI uses a vast amount of energy, and it says that AI is partially responsible for climate change.

It even quotes that moron Altman saying that there's not enough energy to meet their needs and that something new needs to be developed.

I have no idea why you think this supports your point at all.

[–] [email protected] 2 points 3 months ago (1 children)

Artificial intelligence requires a lot of power for much the same reason. The kind of machine learning that produced ChatGPT relies on models that process fantastic amounts of information, and every bit of processing takes energy. When ChatGPT spits out information (or writes someone’s high-school essay), that, too, requires a lot of processing. It’s been estimated that ChatGPT is responding to something like two hundred million requests per day, and, in so doing, is consuming more than half a million kilowatt-hours of electricity. (For comparison’s sake, the average U.S. household consumes twenty-nine kilowatt-hours a day.)

That was the only bit I was referring to, as a source for the ~0.5 GWh of energy usage per day for GPT. I agree that what Altman says is worthless, or worse, deliberately manipulative to keep the VC money flowing into OpenAI.
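
For scale, here's a quick back-of-the-envelope sketch using only the figures quoted above; the per-request and household-equivalent numbers are my own arithmetic, not taken from the article:

```python
# Rough sanity check of the figures quoted from the article above.
# All inputs are the article's estimates, not measured values.

requests_per_day = 200_000_000      # ~200 million ChatGPT requests per day
energy_per_day_kwh = 500_000        # >half a million kWh per day (~0.5 GWh)
household_per_day_kwh = 29          # average U.S. household consumption per day

# Energy per individual request, converted to watt-hours.
wh_per_request = energy_per_day_kwh * 1_000 / requests_per_day
print(f"~{wh_per_request:.1f} Wh per request")  # ~2.5 Wh

# Daily consumption expressed as equivalent U.S. households.
households = energy_per_day_kwh / household_per_day_kwh
print(f"~{households:,.0f} households' worth of electricity per day")  # ~17,000
```

So roughly 2.5 Wh per request, or about 17,000 U.S. households' worth of electricity per day, if the article's estimates hold.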

[–] [email protected] -1 points 3 months ago

I see, so if we ignore the rest of the article entirely, your point is supported. What an odd way of trying to prove a point.

Also, I guess this was a lie:

Ok, you just want to shout not discuss so I wont engage any further.

Although since it was a lie, I'd love you to tell me what you think I was shouting about.