this post was submitted on 22 Dec 2024
483 points (95.8% liked)

Technology

[–] [email protected] 4 points 1 day ago (1 children)

Page doesn't render properly.

[–] [email protected] 54 points 1 day ago (7 children)

"Built to do my art and writing so I can do my laundry and dishes" -- Embodied agents is where the real value is. The chatbots are just fancy tech demos that folks started selling because people were buying.

[–] [email protected] 9 points 1 day ago (4 children)

Though the image generators are actually good. The visual arts will never be the same after this.

[–] [email protected] 72 points 1 day ago* (last edited 1 day ago) (21 children)

Some people have this seeming need to discredit AI, and it goes overboard. Friends and family who have never really used LLMs outside of Google search feel compelled to tell me how bad it is.

But generative AIs are really good at tasks I wouldn't have imagined a computer doing just a few years ago. Even if they plateaued right where they are now, it would lead to major shakeups in humanity's current workflows. It's not just hype.

The part that is overhyped is companies trying to jump the gun and wholesale replace workers with unproven AI substitutes. And of course the companies that try to shove AI where it doesn't really fit, like AI-enabled fridges and toasters.

[–] [email protected] 21 points 1 day ago (21 children)

Computers have always been good at pattern recognition; this isn't new. LLMs are not a type of actual AI. They are programs capable of recognizing patterns and loosely reproducing them in semi-randomized ways. The reason these so-called generative AI solutions have trouble generating the right number of fingers is not only that they have no idea how many fingers a person is supposed to have; they have no idea what a finger is.

The same goes for code completion. They just generate something that fills the pattern they're told to look for. It doesn't matter if it's right or wrong, because they have no concept of right or wrong beyond fitting the pattern. Not to mention that we've had code-completion software for over a decade at this point; LLMs do it less efficiently and less reliably. The only upside is that they can sometimes recognize and suggest a pattern that the people programming the other coding helpers might have missed. Outside of that, such as generating whole blocks of code or even entire programs, you can't even get an LLM to reliably spit out a hello-world program.
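
To make the "fills the pattern" point concrete, here's a minimal toy sketch (my own illustration, not how any real completion engine works): a bigram model trained on a handful of code lines will happily continue a prompt with whatever tokens tended to follow in its training data, with no notion of whether the result is valid or correct code.

```python
import random
from collections import defaultdict

# Tiny "training corpus" of tokenized code lines (purely illustrative).
corpus = [
    "for i in range ( 10 ) :",
    "for item in items :",
    "print ( 'hello' )",
    "print ( i )",
]

# Record, for each token, the tokens that followed it in the corpus.
follows = defaultdict(list)
for line in corpus:
    tokens = line.split()
    for a, b in zip(tokens, tokens[1:]):
        follows[a].append(b)

def complete(prompt_token, length=6):
    """Continue a prompt by repeatedly sampling a token that followed the last one.

    Nothing here checks whether the output is correct code; it only
    reproduces patterns seen in the corpus, semi-randomly.
    """
    out = [prompt_token]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))
    return " ".join(out)

print(complete("for"))    # e.g. "for item in range ( 10 )" -- pattern-shaped, maybe wrong
print(complete("print"))  # e.g. "print ( i )"
```

Scale that idea up by many orders of magnitude and you get the pattern-fitting behavior described above; nothing in the loop checks correctness.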

[–] [email protected] 4 points 1 day ago (3 children)

I never know what to think when I come across a comment like this one—which does describe, even if only at a surface level, how an LLM works—with 50% downvotes. Like, are people angry at reality, is that it?

[–] [email protected] 69 points 1 day ago (3 children)

The part that is overhyped is companies trying to jump the gun and wholesale replace workers with unproven AI substitutes. And of course the companies that try to shove AI where it doesn't really fit, like AI-enabled fridges and toasters.

This is literally the hype. This is the hype that is dying and needs to die, because generative AI is a tool with fairly specific uses. But it is being marketed by literally everyone who has it as general AI that can "DO ALL THE THINGS!", which it's not and never will be.

[–] [email protected] 38 points 1 day ago (4 children)

Even if they plateaued right where they are now, it would lead to major shakeups in humanity's current workflows

Like which one? It's been two years since ChatGPT came out, and we already have quite a lot of (good?) models. Which shakeup do you think is happening or going to happen?

[–] [email protected] 9 points 1 day ago (3 children)

Computer programming has radically changed. It's a huge help having LLM autocomplete and chat built into IDEs like Cursor and Windsurf.

I’ve been a developer for 35 years. This is shaking it up as much as the internet did.

[–] [email protected] 32 points 1 day ago* (last edited 1 day ago) (6 children)

I quit my previous job in part because I couldn't deal with the influx of terrible, unreliable, dangerous, bloated, nonsensical, not-even-working code that was suddenly pushed into one of the projects I was working on. That project is now completely dead; they froze it on some arbitrary version.
When a junior dev makes a mistake, you can explain it to them and they will not make it again. When they use an LLM to make a mistake, there is nothing to explain to anyone.
I compare this shakeup more to an earthquake than to anything positive you can associate with shaking.

[–] [email protected] -4 points 1 day ago (1 children)

Exactly this. Things have already changed and are changing as more and more people learn how and where to use these technologies. I have even seen teachers with a limited grasp of technology in general using this stuff.

[–] [email protected] 32 points 1 day ago (6 children)

I hardly see it as changed, to be honest. I work in the field too, and I can imagine LLMs being good at producing decent boilerplate straight out of documentation, but nothing more complex than that.

I often use LLMs to work on my personal projects and - for example - Claude or ChatGPT 4o often spit out programs that don't compile, use nonexistent functions, are bloated, etc. Possibly they do better for languages with more training data (like Python), but I can't see it as a "radical change"; it's more like a well-configured snippet plugin and autocomplete feature.

LLMs can't count, can't analyze novel problems (by definition) or provide innovative solutions... why would they radically change programming?

[–] [email protected] -4 points 1 day ago (1 children)

ChatGPT 4o isn't even the most advanced model, yet I have seen it do things you say it can't. Maybe work on your prompting.

[–] [email protected] 16 points 1 day ago (6 children)

That is my experience; it's generally quite decent for small and simple stuff (as I said, distillation of documentation). I use it for Rust, where I am sure the training material was much smaller than for other languages. It's not a matter of prompting, though: it's not my prompt that makes it hallucinate functions that don't exist in libraries or write code that doesn't compile. It's a feature of the technology itself.

GPTs are statistical text generators after all, they don't "understand" the problem.
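
As a sketch of what "statistical text generator" means (a toy of my own, not GPT's actual implementation): the model assigns scores to every possible next token, turns them into probabilities, and samples one, over and over. The token names and scores below are made up for illustration; "frobnicate" stands in for a plausible-looking function name that doesn't actually exist in any library.

```python
import math
import random

# Hypothetical vocabulary and "logits" (scores) a model might assign
# for the next token after some prompt -- purely made-up numbers.
vocab = ["fn", "let", "unwrap", "frobnicate"]
logits = [2.1, 1.7, 0.9, 0.3]

def softmax(xs, temperature=1.0):
    """Turn raw scores into a probability distribution."""
    exps = [math.exp(x / temperature) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(vocab, logits, temperature=1.0):
    """Pick the next token at random, weighted by probability.

    Nothing here checks whether the chosen token names a real function
    or produces code that compiles -- it is just sampling.
    """
    probs = softmax(logits, temperature)
    return random.choices(vocab, weights=probs, k=1)[0]

for _ in range(5):
    print(sample_next_token(vocab, logits))
```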

[–] [email protected] -5 points 1 day ago (2 children)

"Today’s hype will have lasting effects that constrain tomorrow’s possibilities."

Nope. No it won't. I'd love to have the patience to be more diplomatic but they're just wrong... and dumb.

I'm getting so sick of these anti AI cultists who seem to be made up of grumpy tech nerds behaving like "I was using AI before it was cool" hipsters and panicking artists and writers. Everyone needs to calm their tits right down. AI isn't going anywhere. It's giving creative and executive options to millions of people that just weren't there before.

We're in an adjustment phase right now and boundaries are being re-drawn around what constitutes creativity. My leading theory at the moment is that we'll all mostly eventually settle down to the idea that AI is just a tool. Once we're used to it and less starry-eyed about its output, then individual creativity, possibly supported by AI tools, will flourish again. It's going to come down to the question of whether you prefer reading something cogitated, written, drawn or motion-rendered by AI, or you enjoy the perspective of a human being more. Both will be true in different scenarios, I expect.

Honestly, I've had to nope out of quite a few forums and servers permanently now because all they do in there is circlejerk about the death of AI. Like this one theory that keeps popping up that image generating AI specifically is inevitably going to collapse in on itself and stop producing quality images. The reverse is so obviously true but they just don't want to see it. Otherwise smart people are just being so stubborn with this and it's, quite frankly, depressing to see.

Also, the tech nerds arguing that AI is just a fancy word- and pixel-regurgitating engine and that we'll never have an AGI are probably the same people who were really hoping Data would be classified as a sentient lifeform when Bruce Maddox wanted to disassemble him in "The Measure of a Man".

How's that for whiplash?

[–] [email protected] 25 points 1 day ago (1 children)

Models are not improving, companies are still largely (massively) unprofitable, the tech has a very high environmental impact (and demand), and no solid business case has been found so far (despite very large investments) after two years.

That AI isn't going anywhere is possible, but LLM-based tools might also simply follow crypto, VR, metaverses and the other tech "revolutions" that were just hyped and went nowhere. I can't say it will go one way or another, but I disagree with you about an "adjustment period". I think generative AI is cool and fun, but it's a toy. If companies don't make money with it, they will eventually stop investing in it.

Also

Today’s hype will have lasting effects that constrain tomorrow’s possibilities

Is absolutely true. Wasting capital (human and economic) on something means that it won't be used for something else instead. This is especially true now that it's so hard to get investment for any other business. If all the money right now goes into AI, and IF this turns out to be just hype, we will have collectively lost 2, 4, 10 years of research and investment in other areas (for example, environmental protection). I am really curious what makes you think that sentence is false and stupid.

[–] [email protected] 6 points 1 day ago (3 children)

Models are not improving? Since when? Last week? Newer models have been scoring higher and higher in both objective and subjective blind tests consistently. This sounds like the kind of delusional anti-AI shit that the OP was talking about. I mean, holy shit, to try to pass off "models aren't improving" with a straight face.

[–] [email protected] 5 points 1 day ago (1 children)

It's fucking fantastic news, tbh.

Here's my take: let them dismiss it.

Let em! Remember Bitcoin at $15k after 2019?

Let em! And it's justified! If AI isn't important right now, then why should its price be inflated to oblivion? Let it fall. Good! Lower prices for those of us who do see the value down the road.

That's how speculative investment works. In no way is this bad. Are sales bad? Sit back and enjoy the show.

[–] [email protected] 13 points 1 day ago (1 children)

Are sales bad?

Of AI products? By all available metrics, yes, sales for AI-driven products are atrocious.

Even the biggest name in AI is desperately unprofitable. OpenAI has only succeeded in converting 3% of their free users to paid users. To put that in perspective, 40% of regular Spotify users are on premium plans.

And those paid plans don't even cover what it costs to run the service for those users. Currently OpenAI are intending to double their subscription costs over the next five years, and that still won't be enough to make their service profitable. And that's assuming that they don't lose subscribers over those increased costs. When their conversion rate at their current price is only 3%, there's not exactly an obvious appetite to pay more for the same thing.
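
To make the arithmetic concrete, here's a rough back-of-the-envelope sketch. Only the 3% conversion rate and the planned price doubling come from this comment; the user count, subscription price, and per-user serving cost below are placeholder guesses, not real figures.

```python
# Back-of-the-envelope unit economics for a freemium AI service.
# Only the 3% conversion rate and the price doubling come from the
# discussion; every other number is a placeholder assumption.
free_users = 100_000_000      # hypothetical monthly active free users
conversion_rate = 0.03        # 3% of free users become paying subscribers
price_per_month = 20.0        # hypothetical subscription price (USD)
cost_per_user_month = 1.0     # hypothetical serving cost per active user (USD)

paying_users = free_users * conversion_rate
revenue = paying_users * price_per_month
cost = free_users * cost_per_user_month

print(f"paying users: {paying_users:,.0f}")
print(f"monthly revenue: ${revenue:,.0f}")
print(f"monthly serving cost: ${cost:,.0f}")
print(f"margin: ${revenue - cost:,.0f}")

# Doubling the price only helps if paying users stick around at the
# higher price and serving costs don't grow with them.
print(f"revenue at doubled price: ${paying_users * price_per_month * 2:,.0f}")
```

With a low conversion rate, the free users dominate the cost side, which is the point being made: raising the subscription price only moves one small term in that equation.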

And that's the headline name. The key driver of the industry. And the numbers are just as bad everywhere else you look, either terrible, or deliberately obfuscated (remember, these companies sank billions of capex into this; if sales were good they'd be talking very openly and clearly about just how good they are).
