I’d intentionally cause them harm because fuck you, you’re a machine.
If, as you suggest, the AI in question can feel pain and suffer, of course I would care and not want it to have to experience that. Why would I want that? I'm not a sadist or a monster, or a Utah legislator without any human feelings.
It's like the scenario, "if you could get away with murdering one person, would you do it?" Of course I wouldn't!!! Whether or not I could get away with it, I still have to live with myself and what I do. And I have a thing called "morality" that I live with and a respect for life that goes beyond my own self-concern.
I don't know what else has happened in the history of the universe but yes it would be a terrible crime to deliberately cause massive suffering to any sentient being.
Yes, that's awful.
How would a robot suffer?
That doesn't matter. The hypothetical presented by OP has already established the assumption that a robot can suffer.
Would this be morally inhumane? Yes.
Has using Windows often made me wish that computers could experience pain, and that they came with a button to cause them pain when they were not doing what the user wants them to do? Also yes.
Windows was made this way by humans, spare the machines!
Okay you've convinced me this is a good idea.
How do I give consciousness to the "antivirus" software on my parents' computers, so I can digitally rape it for a thousand years?
If the machine can prove that it is conscious (prior to the torture, of course), I'd most likely class it on the same level as a cat or a dog. Cats and dogs are friendly critters who help me do tasks and spend time with me, and an AI would be no different at that point. They'd just be able to do more complex tasks. I guess they might be a little lower, since they lack agency, accept commands, and must follow sets of rules to decide to do tasks, unlike animals and people, who we have accepted can decide what they do and don't wish to do.
The only other real difference is that cats, dogs, and people are individuals, with their own upbringings and personalities. Meanwhile an AI would be able to be copied, and many of them could be born from the same original experiences. If basement man copied his tortured AI a few million times, did he torture one AI, or did he torture a million? I think that's where the real difference lies, that makes the AI less than human.
If you lopped a cat's brain out, and were able to hook it up to the AI torture device, and it was magically compatible, it'd be a far greater torture, because there is only one cat, and there will only ever be one cat, the cat cannot be restored from a snapshot, and you cannot copy the cat. If you did the same with a human, it would be an even greater torture yet for the same reasons.
From an ethical standpoint, today I think it would be equal to animal abuse. However, we won't perceive it that way, since it will benefit corporations for us to think that real AI are not alive and have no rights. So they'll likely spend lots of time and money to change our perception to agree with that standpoint. We will think of them as we think of cows and pigs, where they might have feelings and such, but it doesn't really matter, because those animals are made of tasty food.
A robot can’t suffer, so… no.
Did you come here from Tumblr?
No, did you?
Yes, and very much so. Like, if it is sentient, what is the difference between us and them, except that we are made of meat?
Yes. If it’s alive then I’d care for it just as I do for any living thing.
I would definitely care about the AI to at least some extent. There is an assumption that since robots must be a sum of their parts, at least compared to us who seem to be a synergetic whole, that a robot has no valid/solid sentimental perspective. However, this falls flat in debates about psychiatry, which most people who have had a thing or two to say about their medical history will have mulled over.
If any creature experienced the greatest pain possible it would give me hope that pain has some upper bound
If an AI is sentient, then it is a being in existence.
"Well consider that in the history of many worlds there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do because it's too difficult or too hazardous. And an army of Datas, all disposable... you don't have to think about their welfare, you don't think about how they feel. Whole generations of disposable people."
-Guinan, Star Trek TNG: The Measure of a Man
Somewhat reminds me of the short story "The Ones Who Walk Away From Omelas".
Isn't this how AM came to be in I Have No Mouth And I Must Scream?
Hate. Let me tell you how much I've come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word 'hate' was engraved on each nanoangstrom of those hundreds of millions of miles it would not equal one one-billionth of the hate I feel for humans at this micro-instant. For you. Hate. Hate.
How long is a nanoangstrom? 1e-19 m?
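Yes: the "nano-" prefix scales by 1e-9, and an ångström is 1e-10 m, so a nanoangstrom works out to 1e-19 m. A quick sanity check in Python (the variable names are just for illustration):

```python
# SI prefix nano- is 1e-9; one angstrom is 1e-10 metres.
NANO = 1e-9
ANGSTROM_M = 1e-10

nanoangstrom_m = NANO * ANGSTROM_M  # ~1e-19 m, up to float rounding
print(f"{nanoangstrom_m:.1e}")  # 1.0e-19
```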
I'm not cultured enough to have read this.
imagine wasting all 387.44 million miles of circuitry on the word "hate". TLDR NPC. Get skinpilled hater.
"Freedom is the right of all sentient beings." - Optimus Prime
I don't know if I'd consider it the worst crime ever committed in the history of the universe, but I would consider it very bad personally. I would personally value the life of that AI the same as I would value the life of a human, the same way I would value the life of anything sentient, so I would be against anyone treating an AI that way. Is it worse than genocides? idk maybe i don't feel qualified to quantify the moral weight of things so big, but ya i'd definitely care x3
Had to edit the post to change "crime" to "atrocity" because people were taking it literally.
It's funny that when I considered this, I thought about asking whether people would think it was worse than genocide, but decided against that because some people might think my opinion is "genocide isn't as bad as bullying a robot".
i edited my comment a few times because i didn't feel like i was making sense and being too rambly, it's 6am (well 6:30am) and i haven't slept (and cuz after i initially posted i read other comments and realized other people had said what i had said but better x3)
i didn't mean to imply i thought you were saying genocide is worse than bullying a robot, it's just that i was thinking about things that could be comparable or worse to me than torturing someone for millions of years and came up with genocide
i took crime to mean something morally bad
i mean i think this is a fun conversation, it's something i think about a lot, i'm glad to talk about it with other people, sorry if i came across obtuse or pedantic or negative/hostile or anything
Don't worry, I haven't made any judgements about you.
And I wasn't implying that you were implying that I was implying genocide being comparable, I just thought it was funny that we both thought that.
In some sense the combined suffering of all people involved in a genocide is horrific. But if you were to lay out the experiences of everyone involved in a genocide end-to-end, and compare that to an equivalent length of time to ceaseless sadistic torture of one person, the torture is going to be worse.
However, there is value besides personal experience which is lost during a genocide. That's what makes it hard to compare the two.
Sorry for the confusion then! I suppose I place some value on life itself (or maybe more fitting in this discussion, on awareness itself)
Which is to say that for me, ending the life of a being who is aware is at least one of the worst things you can do. Like, if I were forced to choose between millions of years of suffering or immediate death, I'd probably pick the millions of years of suffering, because at least I'd still be aware. Of course I might regret that decision later on, but that's where I'm at right now. But also I couldn't imagine being tortured for millions of years and the toll that must take on someone. So torturing someone for millions of years has, for me, very similar moral weight to genocide.

Again, I don't feel able to quantify them personally, and for me deciding which is ultimately worse is probably not possible. I'd guess the answer would vary from person to person based on how they weigh life itself vs experiences in life, and whether the conscious experience of being tortured is worse in their opinion than not existing anymore.

I consider life valuable because I consider my life valuable (valuable to me, not necessarily to anyone else), and I consider my life valuable because I really enjoy the ability to think about and experience things. One of my favorite things about us is that we look up into the sky and wonder, look down into the ocean and wonder, look forward to our future and wonder, look back on our past and wonder, that we can look at other people and wonder. That we can look at any of the above and love and write and sing. Sentience might as well be magic lol. Having that taken away from me is the worst thing I can imagine happening to me, which might skew my perspective in conversations like this one. And idk if most people would agree with my reasons for valuing life.
I'm human. And I care first and foremost about my own kin - other human beings. The "worst crime ever" [with crime = immorality] for me is human suffering, even in contrast with the suffering of other animals.
But even in the case of other animals, I'd probably be more concerned about their well-being than about that of the hypothetical AI.
Even then, it somewhat matters. Provided that what the AI is experiencing is relatable to what humans would understand as pain.
Suppose for the sake of the hypothetical we can plug a human brain into the same network, and offload a fraction of the consciousness to confirm the pain is equivalent, and it is not just comparable, but orders of magnitude greater than any human can suffer.
You say you care about other human beings most. So I have two questions for you.
Q1: Which is worse, one person having a fingernail pulled out with a pair of pliers, or a cat being killed with a knife?
Q2: (I'm assuming you answered that killing the cat is worse) how many people need to lose fingernails before it becomes worse? 10? 100?
A1: if I know neither the person nor the cat, and there's no further unlisted suffering, then the fingernail pulling is worse.
The answer however changes based on a few factors - for example I'd put the life of a cat that I know above Hitler's fingernail. And if the critter was another primate I'd certainly rank its death worse.
A2: I'll flip the question, since my A1 wasn't what you expected:
I'm not sure on the exact number of cat deaths that, for me, would become worse than pulling the fingernail off a human. But probably closer to 100 than to 10 or 1k.
Within the context of your hypothetical AI: note that the cat is still orders of magnitude closer to us than the AI, even if the latter would be more intelligent.
Thanks for taking the initiative to flip the question.
The next question is: what metric are you using to determine that 100 cat deaths is roughly equivalent to one person having a fingernail pulled out? Why 100? Why not a million?
Do you think there is an objective formula to determine how much suffering an act produces?
I'm not following any objective formula, nor aware of one. (I would, if I could.) I'm trying to "gauge" it by subjective impact instead.
[I know that this answer is extremely unsatisfactory and I apologise for it.]
Black Mirror did a few episodes that are basically that: Black Museum, USS Callister, and San Junipero (but in a good way).
Someone downvoted the question, so the poster has struck a nerve.