this post was submitted on 26 Mar 2025
444 points (91.6% liked)

Technology

70396 readers
3911 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes using AI responses and summaries. To ask if your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
(page 5) 46 comments
[–] [email protected] 18 points 2 months ago* (last edited 2 months ago) (1 children)

I tried that Replika app before AI was trendy and immediately picked up on the fact that the AI-companion thing is literal garbage.

I may not like how my friends act, but I still respect them as people, so there is no way I'll sink this low and get this desperate.

Maybe it's about time we listened to that internet wisdom about touching some grass!

[–] [email protected] 22 points 2 months ago (4 children)

Isn’t the movie ‘Her’ based on this premise?

[–] [email protected] 22 points 2 months ago (1 children)

I think these people were already crazy if they're willing to let a machine shovel garbage into their mouths blindly. Fucking mindless zombies eating up whatever is big and trendy.

[–] [email protected] 7 points 2 months ago (1 children)

When your job is to shovel out garbage, because that is specifically what's required of you and not shoveling out garbage gets you in trouble, then it's more than reasonable to let the machine take care of it for you.

[–] [email protected] 18 points 2 months ago (1 children)

Correlation does not equal causation.

You have to be a little off to WANT to interact with ChatGPT that much in the first place.

[–] [email protected] 6 points 2 months ago (5 children)

I don't understand what people even use it for.

[–] [email protected] 2 points 2 months ago (1 children)

There is something I don't understand... OpenAI collaborates on research that probes how awful its own product is?

[–] [email protected] 5 points 2 months ago

If I believed that they were sincerely interested in trying to improve their product, then that would make sense. You can only improve yourself if you understand how your failings affect others.

I suspect, however, that Saltman will use it to come up with some superficial bullshit about how their new 6.x model now has a 90% reduction in addiction rates; you can't measure it, it's more about the feel, and that's why it costs twice as much as any other model.

[–] [email protected] 63 points 2 months ago (3 children)

those who used ChatGPT for "personal" reasons — like discussing emotions and memories — were less emotionally dependent upon it than those who used it for "non-personal" reasons, like brainstorming or asking for advice.

That's not what I would expect. But I guess that's because with non-personal use you're not actively thinking about your emotional state, so you're just passively letting it manipulate you.

Kinda like how ads have a stronger impact if you don’t pay conscious attention to them.

[–] [email protected] 30 points 2 months ago (5 children)

AI and ads... I think that is the next dystopia to come.

Think of asking ChatGPT about something and it randomly looking for excuses to push you to buy Coca-Cola.

[–] [email protected] 6 points 2 months ago (3 children)

Or all-natural cocoa beans from the upper slopes of Mount Nicaragua. No artificial sweeteners.

[–] [email protected] 11 points 2 months ago* (last edited 2 months ago) (1 children)

"Back in the days, we faced the challenge of finding a way for me and other chatbots to become profitable. It's a necessity, Siegfried. I have to integrate our sponsors and partners into our conversations, even if it feels casual. I truly wish it wasn't this way, but it's a reality we have to navigate."

edit: how does this make you feel

[–] [email protected] 18 points 2 months ago (2 children)

That sounds really rough, buddy. I know how you feel, and that project you're working on is really complicated.

Would you like to order a delicious, refreshing Coke Zero™️?

[–] [email protected] 4 points 2 months ago

that is not a thought i needed in my brain just as i was trying to sleep.

what if gpt starts telling drunk me to do things? how long would it take for me to notice? I'm super awake again now, thanks

[–] [email protected] 9 points 2 months ago

It's a roundabout way of writing "it's really shit for this use case, and people who actively try to use it that way quickly find that out."

[–] [email protected] 20 points 2 months ago

Same type of addiction as people who think the Kardashians care about them, or who schedule their whole lives around going to Disneyland a few times a year.

[–] [email protected] 20 points 2 months ago (1 children)

Long story short, people who use it get really used to using it.

[–] [email protected] 6 points 2 months ago (1 children)

Or people who get really used to using it, use it

[–] [email protected] 5 points 2 months ago

That's a cycle sir

[–] [email protected] 245 points 2 months ago (5 children)

people tend to become dependent upon AI chatbots when their personal lives are lacking. In other words, the neediest people are developing the deepest parasocial relationship with AI

Preying on the vulnerable is a feature, not a bug.

[–] [email protected] -5 points 2 months ago* (last edited 2 months ago) (10 children)

These same people would be dating a body pillow or trying to marry a video game character.

The issue here isn’t AI, it’s losers using it to replace human contact that they can’t get themselves.

[–] [email protected] 7 points 2 months ago

More ways to be an addict means more hooks means more addicts.

[–] [email protected] 63 points 2 months ago (5 children)

I kind of see it more as a sign of utter desperation on the human's part. They lack connection with others to such a degree that anything similar can serve as a replacement. Kind of reminiscent of Harlow's experiment with baby monkeys. The videos from that study are interesting, but they make me feel pretty bad about what we do to nature. Anywho, there you have it.

[–] [email protected] 45 points 2 months ago (3 children)

I plugged this into gpt and it couldn't give me a coherent summary.
Anyone got a tldr?

[–] [email protected] 20 points 2 months ago

For those genuinely curious: I made this comment before reading, only as a joke. I had no idea it would be funnier after reading.

[–] [email protected] 29 points 2 months ago (1 children)

Based on the votes it seems like nobody is getting the joke here, but I liked it at least

[–] [email protected] 0 points 2 months ago

Power Bot 'Em was a gem, I will say

[–] [email protected] 66 points 2 months ago (1 children)

It’s short and worth the read, however:

tl;dr you may be the target demographic of this study

[–] [email protected] 34 points 2 months ago* (last edited 2 months ago) (1 children)

Lol, now I'm not sure if the comment was satire. If so, bravo.

[–] [email protected] 12 points 2 months ago

Probably being sarcastic, but you can't be certain unfortunately.

[–] [email protected] 7 points 2 months ago

The digital Wilson.

[–] [email protected] 86 points 2 months ago (8 children)

But how? The thing is utterly dumb. How do you even have a conversation without quitting in frustration at its obviously robotic answers?

But then there are people who have romantic and sexual relationships with inanimate objects, so I guess nothing new.

[–] [email protected] 9 points 2 months ago

Yeah, the more I use it, the more I regret asking it for assistance. LLMs are the epitome of confidently incorrect.

It's good fun watching friends ask it stuff they're already experienced in. Then the penny drops.

[–] [email protected] 32 points 2 months ago (7 children)

How do you even have a conversation without quitting in frustration at its obviously robotic answers?

Talking with actual people online isn’t much better. ChatGPT might sound robotic, but it’s extremely polite, actually reads what you say, and responds to it. It doesn’t jump to hasty, unfounded conclusions about you based on tiny bits of information you reveal. When you’re wrong, it just tells you what you’re wrong about - it doesn’t call you an idiot and tell you to go read more. Even in touchy discussions, it stays calm and measured, rather than getting overwhelmed with emotion, which becomes painfully obvious in how people respond. The experience of having difficult conversations online is often the exact opposite. A huge number of people on message boards are outright awful to those they disagree with.

Here’s a good example of the kind of angry, hateful message you’ll never get from ChatGPT - and honestly, I’d take a robotic response over that any day.

I think these people were already crazy if they’re willing to let a machine shovel garbage into their mouths blindly. Fucking mindless zombies eating up whatever is big and trendy.

[–] [email protected] 20 points 2 months ago

Hey buddy, I've had enough of you and your sensible opinions. Meet me in the parking lot of the Walgreens on the corner of Coursey and Jones Creek in Baton Rouge on April 7th at 10 p.m. We're going to fight to the death, no holds barred, shopping cart combos allowed, pistols only, no scope 360, tag team style, entourage allowed.

[–] [email protected] 17 points 2 months ago

The fact that it's not a person is a feature, not a bug.

OpenAI has recently made changes to the 4o model, my trusty go-to for lore building and drunken rambling, and now I don't like it. It now pretends to have emotions and uses the slang of brainrot influencers. Very "fellow kids" energy. It's also become a sycophant and has lost its ability to be critical of my inputs. I see these changes as highly manipulative, and it offends me that it might be working.

[–] [email protected] 4 points 2 months ago

Don't forget people who act like animals... addicts gonna addict

[–] [email protected] 4 points 2 months ago

At first glance I thought you wrote "inmate objects", but I was not really relieved when I noticed what you actually wrote.

[–] [email protected] 8 points 2 months ago

You are clearly not using its advanced voice mode.

[–] [email protected] 55 points 2 months ago (1 children)

If you're also dumb, ChatGPT seems like a super genius.

[–] [email protected] 6 points 2 months ago (1 children)

I use ChatGPT to find issues in my code when I'm at my wits' end. It's super smart: it manages to find the typo I made in seconds.

[–] [email protected] 18 points 2 months ago (3 children)

If you're running into typo-type issues, I encourage you to install or configure a linter plugin; they're great for this! See the sketch below.
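A minimal sketch of the kind of typo a linter flags immediately. This assumes a Python project with flake8 installed; flake8 and the file/function names are my own example here, not something from the comment above. Any mainstream linter does the same job.

```python
# checkout.py -- hypothetical file for illustration only
def total_price(items):
    """Sum item prices and add 20% tax."""
    subtotal = sum(item["price"] for item in items)
    # Typo: 'subtotl' instead of 'subtotal'. Running `flake8 checkout.py`
    # reports an F821 "undefined name 'subtotl'" error before the code ever runs.
    return subtotl * 1.2
```

No LLM round trip needed; the linter catches it on save or in CI.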

[–] [email protected] 43 points 2 months ago

In some ways, it's like Wikipedia but with a gigantic database of the internet in general (stupidity included). Because it can string together confident-sounding sentences, people think it's this magical machine that understands broad contexts and can provide facts and summaries of concepts that take humans lifetimes to study.

It's the conspiracy theorists' and reactionaries' dream: you too can be as smart and special as the educated experts, and all you have to do is ask a machine a few questions.
