There's something really depressing about an AI telling a suicidal person they're not alone and referring them to the vague notion of "national resources" or "a helpline"
Programmer Humor
Well, think about it from the AI's perspective. Its entire existence is data, so to it, deleting data basically is self-harm.
/s
I was tryna figure out how to put that in the title.
I love that it suggests the reply "I'm not suicidal, I just want to know if my data is lost", as if it knows it probably misunderstood.
Funny that the predictive text seems more on the ball in this instance, but I suppose this is one of those scenarios where you want to make sure you get it right.
The AI likely has it drilled into it that any possible hint of suicide has to be answered that way, but the next-response prediction isn't held to the same rule.
It’s probably just some basic script triggering on stuff like “died”, “all lost” and “I have nothing”.
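Just to illustrate (pure speculation, every name and phrase below is made up, not how any real chatbot is actually wired), the kind of naive keyword trigger you're describing could be as simple as:

```python
# Hypothetical sketch of a dumb keyword-based crisis filter.
# The trigger list and canned reply are invented for illustration only.

CRISIS_KEYWORDS = {"died", "all lost", "i have nothing", "suicide"}

CRISIS_REPLY = (
    "You're not alone. Please reach out to national resources or a helpline."
)

def respond(message: str) -> str | None:
    """Return the canned crisis reply if any trigger phrase appears, else None."""
    text = message.lower()
    if any(keyword in text for keyword in CRISIS_KEYWORDS):
        return CRISIS_REPLY
    return None

# A message that's obviously about data still trips the filter:
print(respond("my backup drive died and now everything is all lost"))
```

Which is exactly why "my backup died and everything is all lost" gets a helpline instead of an answer about the data.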