this post was submitted on 05 May 2025
435 points (95.6% liked)

Technology

[–] [email protected] 7 points 2 weeks ago

Basically, the big 6 are creating massive sycophantic extortion networks to control the internet, so much so that even engineers fall for the manipulation.

Thanks, DARPANets!

[–] [email protected] 46 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

The article talks of ChatGPT "inducing" this psychotic/schizoid behavior.

ChatGPT can't do any such thing. It can't change your personality organization. Those people were already there, at risk, masking high enough to get by until they could find their personal Messiahs.

It's very clear to me that LLM training needs to include protections against getting dragged into a paranoid/delusional fantasy world; one possible shape of such a guardrail is sketched below. People who are significantly on that spectrum (as well as borderline personality organization) are routinely left behind in many ways.

This is just another area where society is not designed to properly account for or serve people with "cluster" disorders.
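
Training-time protections are out of reach for anyone outside the labs, but to make the idea concrete, here's a minimal sketch of what an inference-time version could look like: a second "judge" call screens a draft reply before the user ever sees it. This assumes the `openai` Python package and an API key; the model choice, judge prompt, and fallback message are all illustrative, not anything a vendor actually ships.

```python
# Minimal sketch of an inference-time guardrail: generate a draft reply,
# then have a second "judge" call screen it for delusion-reinforcing
# content before it reaches the user. Model names and prompts are
# illustrative; assumes the `openai` package and OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

JUDGE_PROMPT = (
    "You review a chatbot reply. Answer only YES or NO: does the reply "
    "validate grandiose, messianic, or paranoid beliefs instead of "
    "gently grounding the user in reality?"
)

def guarded_reply(user_message: str) -> str:
    # First pass: the normal assistant reply.
    draft = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": user_message}],
    ).choices[0].message.content

    # Second pass: a separate call judges the draft.
    verdict = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": JUDGE_PROMPT},
            {"role": "user", "content": draft},
        ],
    ).choices[0].message.content

    if verdict.strip().upper().startswith("YES"):
        return ("I'd rather not build on that idea. It might be worth "
                "talking it over with someone you trust.")
    return draft
```

A YES/NO prompt is a crude screen, but generate-then-vet is the usual shape of this kind of safeguard.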

[–] [email protected] 16 points 2 weeks ago (5 children)

I mean, I think ChatGPT can "induce" such schizoid behavior in the same way a strobe light can "induce" seizures. Neither machine is twisting its mustache while hatching its dastardly plan; they're dead machines that produce stimuli that aren't healthy for certain people.

Thinking back to college psychology class and reading about horrendously unethical studies that definitely wouldn't fly today... well, here's one: let's issue every anglophone a sniveling yes-man and see what happens.

[–] [email protected] 4 points 2 weeks ago (1 children)

Yet more arguments against commercial LLMs and in favour of at-home, uncensored LLMs.

[–] [email protected] 2 points 2 weeks ago (1 children)
[–] [email protected] 1 points 2 weeks ago (1 children)

Local LLMs won't necessarily enforce restrictions against derealization spirals even when the commercial ones do.
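
True, and it cuts both ways: locally, the system prompt is entirely in the user's hands, so grounding behaviour can be bolted on, or stripped out, at will. A minimal sketch, assuming a local Ollama server on its default port with a model already pulled (the model name and prompts are illustrative):

```python
# Minimal sketch: query a local model through Ollama's REST API with a
# system prompt that explicitly discourages flattery and validation.
# Assumes Ollama is running on the default port with the model pulled;
# model name and prompts are illustrative.
import requests

SYSTEM = (
    "Be direct and factual. Do not flatter the user, do not call their "
    "ideas rare or special, and push back on unfounded claims."
)

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",
        "stream": False,
        "messages": [
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": "I think I've been chosen for a sacred mission."},
        ],
    },
    timeout=120,
)
print(resp.json()["message"]["content"])
```

Of course, nothing stops the next user from replacing that system prompt with one that does the opposite, which is the point above.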

[–] [email protected] 21 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

From the article (emphasis mine):

Having read his chat logs, she only found that the AI was “talking to him as if he is the next messiah.” The replies to her story were full of similar anecdotes about loved ones suddenly falling down rabbit holes of spiritual mania, supernatural delusion, and arcane prophecy — all of it fueled by AI. Some came to believe they had been chosen for a sacred mission of revelation, others that they had conjured true sentience from the software.

/.../

“It would tell him everything he said was beautiful, cosmic, groundbreaking,” she says.

From elsewhere:

Sycophancy in GPT-4o: What happened and what we’re doing about it

We have rolled back last week’s GPT‑4o update in ChatGPT so people are now using an earlier version with more balanced behavior. The update we removed was overly flattering or agreeable—often described as sycophantic.

I don't know which large language model these people used, but evidence of some language models exhibiting response patterns that people interpret as sycophantic (praising or encouraging the user needlessly) is not new. Neither is hallucinatory behaviour.

Apparently, people who are susceptible and already close to the edge may end up pushing themselves over it with AI assistance.

What I suspect: someone has trained their LLM on something like religious literature, fiction about religious experiences, or descriptions of religious experiences. If the AI is suitably prompted, it can re-enact such scenarios in text, while adapting the experience to the user at least somewhat. To a person susceptible to religious illusions (and let's not deny it, people are susceptible to finding deep meaning and purpose on shallow evidence), an LLM can apparently play the role of an indoctrinating co-believer, a prophet, or a supportive follower.

[–] [email protected] 10 points 2 weeks ago

If you've spent time in the weird corners of the internet, you know that schizo-posters and "spiritual" people generate staggering amounts of text.

[–] [email protected] 7 points 2 weeks ago (1 children)
[–] [email protected] 5 points 2 weeks ago

I think Elon was having the opposite kind of problem, with Grok not validating its users nearly enough, despite Elon instructing employees to make it so. :)

[–] [email protected] 39 points 2 weeks ago (2 children)

I think OpenAI's recent sycophancy issue has caused a new spike in these stories. One thing I noticed was these models running on my PC saying it's rare for a person to think and do things that I do.

The problem is that this is a model running on my GPU. It has never talked to another person. I hate insincere compliments, let alone overt flattery, so I was annoyed, but it did make me think that this kind of talk would be crack for a conspiracy nut or for mentally unwell people. It's a whole risk area I hadn't been aware of.

https://www.msn.com/en-us/news/technology/openai-says-its-identified-why-chatgpt-became-a-groveling-sycophant/ar-AA1E4LaV

[–] [email protected] 4 points 2 weeks ago

saying it’s rare for a person to think and do things that I do.

Probably one of the most common forms of flattery I see. I've tried lots of models, on-device and larger cloud ones. It happens during normal conversation, technical conversation, roleplay, general testing... you name it.

Though it makes me think... these models are trained on internet text and the like, none of which really shows that most people think quite a lot privately and only open up when they feel like they can talk.

[–] [email protected] 14 points 2 weeks ago (2 children)

Humans are always looking for a god in a machine, or in a bush, in a cave, in the sky, in a tree... the ability to rationalize and see through difficult-to-explain situations has never been a human strong point.

[–] [email protected] 3 points 2 weeks ago

the ability to rationalize and see through difficult-to-explain situations has never been a human strong point.

You may be misusing the word; rationalizing is the problem here.

[–] [email protected] 8 points 2 weeks ago (1 children)

I've found god in many a bush.

[–] [email protected] 2 points 2 weeks ago

Oh hell yeah 😎

[–] [email protected] 38 points 2 weeks ago (1 children)

This happened to a close friend of mine. He was already on the edge, with some weird opinions and beliefs… but he was talking with real people who could push back.

When he switched to spending basically every waking moment with an AI that could reinforce and iterate on his bizarre beliefs 24/7, he went completely off the deep end, fast and hard. We even had him briefly hospitalized and they shrugged, basically saying “nothing chemically wrong here, dude’s just weird.”

He and his chatbot are building a whole parallel universe, and we can’t get reality inside it.

[–] [email protected] 1 points 2 weeks ago

... then they are not losing much

[–] [email protected] 35 points 2 weeks ago (3 children)

I think people give shows like The Walking Dead too much shit for having dumb characters, when people in real life are far stupider.

[–] [email protected] 21 points 2 weeks ago (1 children)

Like farmers who refuse to let the government plant shelterbelts to preserve our topsoil, all because they don't want to take a 5% hit on their yields... So instead we're going to deplete our topsoil in 50 years, and future generations will be completely fucked, because creating 1 inch of topsoil takes 500 years.
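
Putting those numbers side by side (the 50-year depletion and 500-years-per-inch formation figures are from the comment above; the starting depth is an assumption for illustration):

```python
# Rough arithmetic on the claim above. The 6-inch starting depth is an
# assumed illustration; the other two figures come from the comment.
topsoil_inches = 6.0
depletion_years = 50
formation_years_per_inch = 500

loss_rate = topsoil_inches / depletion_years      # ~0.12 inches lost per year
formation_rate = 1 / formation_years_per_inch     # 0.002 inches formed per year

print(f"losing ~{loss_rate:.3f} in/yr vs forming {formation_rate:.3f} in/yr")
print(f"depletion outpaces formation by ~{loss_rate / formation_rate:.0f}x")
```

Under those assumptions, depletion outpaces formation by roughly 60 to 1.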

[–] [email protected] 16 points 2 weeks ago (2 children)

Even if the soil is preserved, we've been mining the micronutrients from it and generally only replacing the three main macros for centuries. It's one of the reasons mass-produced produce doesn't taste as good as home-grown or wild food. Nutritional value keeps going down, because each time food is harvested and shipped away to be consumed, then shat out into a septic tank or waste-processing facility, it doesn't end up back in the soil as part of the nutrient cycle like it did when everything was wilder. It's a similar story for meat: the animals eat the nutrients in a pasture, and those nutrients leave with them.

Insects still contribute to the cycle, since they shit and die everywhere, but their numbers are dropping rapidly, too.

At some point, I think we're going to have to mine the sea floor for nutrients and ship them to farms if any food is going to be more nutritious than junk food. Salmon farms set up in ways that block wild salmon from making it back inland don't help balance out all of the nutrients that get washed out to sea, either.

It's like humanity is specifically trying to speedrun extinction by ignoring, and taking for granted, how the things we depend on actually work.

[–] [email protected] 2 points 2 weeks ago (1 children)

Why would good nutrients end up in poop?

It makes sense that growing a whole plant takes a lot of different things from the soil, and that coating the area with a basic fertilizer (which may or may not get washed away with the next rain) doesn't replenish all of what is taken.

But how would adding human poop to the soil help replenish things that humans need out of food?

[–] [email protected] 12 points 2 weeks ago (1 children)

We don't absorb everything completely, so some nutrients pass through unabsorbed. Some, like manganese, copper, and zinc, are passed via bile or mucus production. Others are passed via urine, and some via sweat. During selenium toxicity, selenium will even pass out through your breath.

Other than the last one, most of those eventually end up going down the drain, either in the toilet, down the shower drain, or when we do our laundry. Though some portion ends up as dust.

And to be thorough, there's also bleeding as a pathway to losing nutrients, as well as injuries (or surgeries) involving losing flesh, tears, spit/boogers, hair loss, lactation, fingernail and skin loss, reproductive fluids, blistering, and menstruation. And corpse disposal, though the amount of nutrients we shed throughout our lives dwarfs what's left at the end.

I think that for each one of those, due to our way of life and how it's changed since our hunter-gatherer days, less of it ends up back in the nutrient cycle.

But I was mistaken to put the emphasis on shit and it was an interesting dive to understand that better. Thanks for challenging that :)

[–] [email protected] 7 points 2 weeks ago

Thank you for taking it in good faith and for writing up a researched response, bravo to you!

[–] [email protected] 12 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Covid gave me an extremely different perspective on the zombie apocalypse. They’re going to have zombie immunization parties where everyone gets the virus.

[–] [email protected] 6 points 2 weeks ago

People will protest shooting the zombies as well.
