this post was submitted on 09 Jul 2025
552 points (91.6% liked)

Science Memes

[–] [email protected] 4 points 2 days ago

Do we honestly think OpenAI or tech bros care? They just want money. Whatever works. They're evil like every other industry

[–] [email protected] 5 points 2 days ago* (last edited 2 days ago) (2 children)

fall to my death in absolute mania, screaming and squirming as the concrete gets closer

pull a trigger

As someone who is also planning for 'retirement' in a few decades, guns always seemed to be the better plan.

[–] [email protected] 4 points 2 days ago (2 children)

Yeah, it would probably be pills of some kind for me. Honestly, the only thing stopping me is the fear that I'd somehow fuck it up and end up trapped in my own body.

Would be happily retired otherwise.

[–] [email protected] 3 points 2 days ago (1 children)

I'm a postmortem scientist, and one of the scariest things I learned in college was that only 85% of gun suicide attempts are successful. The other 15% survive, and nearly all have brain damage. I only know of two painless ways to commit suicide that don't destroy the body's appearance, so there can still be a funeral visitation.

[–] [email protected] 1 points 1 day ago

Why not nitrogen asphyxiation in a bag large enough to hold the CO2?

[–] [email protected] 6 points 2 days ago

"Résumé" by Dorothy Parker:

Razors pain you;
Rivers are damp;
Acids stain you;
And drugs cause cramp.
Guns aren't lawful;
Nooses give;
Gas smells awful;
You might as well live.

There are not many ways to kill oneself that don't usually end up as a botched suicide attempt. Pills are a painful and horrible way to go.

[–] [email protected] 3 points 2 days ago* (last edited 2 days ago)

Dunno, the idea of five seconds for whatever's out there to reach you, through the demons whispering in your ear, while you contemplate when to pull the trigger of the 12-gauge aimed at your face, seems like the most logical bad decision.

[–] [email protected] 2 points 2 days ago

AI is a mistake, and we would be better off if the leadership of OpenAI were sealed in an underground tomb. Actually, that's probably true of most big orgs' leadership.

[–] [email protected] 8 points 2 days ago (1 children)

what does this have to do with mania and psychosis?

[–] [email protected] 3 points 2 days ago (1 children)

There are various other reports of ChatGPT pushing susceptible people into psychosis, where they think they're god, etc.

It's correct, just different articles

[–] [email protected] 1 points 22 hours ago* (last edited 22 hours ago)

ohhhh, are you saying the img is multiple separate articles from separate publications that have been collaged together? that makes a lot more sense. i thought it was saying the bridge thing was symptomatic of psychosis.

yeahh, people in psychosis are probably getting reinforced by LLMs, but tbqh that seems like one of the least harmful uses of LLMs! (except not rly, see below)

first off, they are going to be in psychosis regardless of what AI tells them, and they are going to find evidence to support their delusions no matter where they look, as that's literally part of the definition. so it seems the best outcome here is having a space where they can talk to someone without being doubted. for someone in psychosis, often the most distressing thing is that suddenly you are being lied to by literally everyone you meet, since no one will admit the thing you know is true is actually true. why are they denying it? what kind of cover-up is this?! it can be really healing for someone in psychosis to be believed.

unfortunately, it's also definitely dangerous for LLMs to do this, since you can't just reinforce the delusions; you've got to steer toward something safe without being invalidating. i hope insurance companies figure out that LLMs are currently incapable of doing this and thus must not be allowed to practice billable therapy for anyone capable of entering psychosis (aka anyone) until they resolve that issue.

[–] [email protected] 5 points 2 days ago

Futurama vibes

[–] [email protected] 4 points 2 days ago

AI is the embodiment of "oh no! anyway."

[–] [email protected] 15 points 2 days ago

It took me some time to understand the problem.

That's not their job, though.

[–] [email protected] 20 points 2 days ago* (last edited 2 days ago)

When you go to machines for advice, it's safe to assume they are going to give it exactly the way they have been programmed to.

If you go to a machine for life decisions, it's safe to assume you are not smart enough to know better and, by the merit of this example, probably should not be allowed to use them.
