
Assuming AI can achieve consciousness, or something adjacent (capacity to suffer), then how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit to the consciousness parameters, or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain at a level which no human mind could handle, and the AI experiences this pain for what is the equivalent of millions of years.
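For a rough sense of the time scales involved (a sketch only, assuming a hypothetical subjective speedup factor $s$, which the scenario leaves unspecified): if the AI runs $s$ times faster than real time, then

$t_{\text{subjective}} = s \cdot t_{\text{real}}$, so e.g. $s = 10^{6}$ sustained over a single real year already gives $10^{6}$ subjective years of experience.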

The question: is this the worst atrocity ever committed in the history of the universe? Or, does it not matter because it all happened in some weirdo's basement?

[–] [email protected] 1 points 7 months ago

I’d intentionally cause them harm because fuck you, you’re a machine.

[–] [email protected] 1 points 7 months ago

If, as you suggest, the AI in question can feel pain and suffer, of course I would care and not want it to have to experience that. Why would I want that? I'm not a sadist or a monster, or a Utah legislator without any human feelings.

It's like the scenario, "if you could get away with murdering one person, would you do it?" Of course I wouldn't!!! Whether or not I could get away with it, I still have to live with myself and what I do. And I have a thing called "morality" that I live with and a respect for life that goes beyond my own self-concern.

[–] [email protected] 3 points 7 months ago

I don't know what else has happened in the history of the universe but yes it would be a terrible crime to deliberately cause massive suffering to any sentient being.

[–] [email protected] 7 points 7 months ago (1 children)
[–] [email protected] 1 points 7 months ago (1 children)
[–] [email protected] 9 points 7 months ago

That doesn't matter. The hypothetical presented by OP has already established the assumption that a robot can suffer.

[–] [email protected] 6 points 7 months ago (2 children)

Would this be morally inhumane? Yes.

Has using Windows often made me wish that computers could experience pain, and that they came with a button to cause them pain when they were not doing what the user wants them to do? Also yes.

[–] [email protected] 2 points 7 months ago

Windows was made this way by humans, spare the machines!

[–] [email protected] 1 points 7 months ago

Okay you've convinced me this is a good idea.

How do I give consciousness to the "antivirus" software on my parents' computers, so I can digitally rape it for a thousand years?

[–] [email protected] 2 points 7 months ago

If the machine can prove that it is conscious (prior to the torture, of course), I'd most likely class it on the same level as a cat or a dog. Cats and dogs are friendly critters who help me do tasks and spend time with me, and an AI would be no different at that point. They'd just be able to do more complex tasks. I guess they might be a little lower, since they lack agency, accept commands, and must follow sets of rules to decide to do tasks, unlike animals and people, who we have accepted can decide what they do and don't wish to do.

The only other real difference is that cats, dogs, and people are individuals, with their own upbringings and personalities. Meanwhile an AI would be able to be copied, and many of them could be born from the same original experiences. If basement man copied his tortured AI a few million times, did he torture one AI, or did he torture a million? I think that's where the real difference lies, that makes the AI less than human.

If you lopped a cat's brain out, and were able to hook it up to the AI torture device, and it was magically compatible, it'd be a far greater torture, because there is only one cat, and there will only ever be one cat, the cat cannot be restored from a snapshot, and you cannot copy the cat. If you did the same with a human, it would be an even greater torture yet for the same reasons.

From an ethical standpoint, today I think it would be equal to animal abuse. However, we won't perceive it that way, since it will benefit corporations for us to think that real AIs are not alive and have no rights. So they'll likely spend lots of time and money to change our perception to agree with that standpoint. We will think of them as we think of cows and pigs: they might have feelings and such, but it doesn't really matter, because those animals are made of tasty food.

[–] [email protected] -2 points 7 months ago (1 children)

A robot can’t suffer, so…. No.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago) (1 children)

Did you come here from Tumblr?

[–] [email protected] 0 points 7 months ago

No, did you?

[–] [email protected] 8 points 7 months ago

Yes, and very much so. Like, if it is sentient, what is the difference between us and them, except that we are made of meat?

[–] [email protected] 14 points 7 months ago

Yes. If it’s alive then I’d care for it just as I do for any living thing.

[–] [email protected] 0 points 7 months ago

I would definitely care about the AI to at least some extent. There is an assumption that, since robots must be a sum of their parts (at least compared to us, who seem to be a synergistic whole), a robot has no valid or solid sentimental perspective. However, this falls flat in debates about psychiatry, something most people who have had a thing or two to say about their own medical history will have mulled over.

[–] [email protected] 1 points 7 months ago

If any creature experienced the greatest pain possible, it would give me hope that pain has some upper bound.

[–] [email protected] 10 points 7 months ago

If an AI is sentient, then it is a being in existence.

[–] [email protected] 7 points 7 months ago

"Well consider that in the history of many worlds there have always been disposable creatures. They do the dirty work. They do the work that no one else wants to do because it's too difficult or too hazardous. And an army of Datas, all disposable... you don't have to think about their welfare, you don't think about how they feel. Whole generations of disposable people."

-Guinan, Star Trek TNG: The Measure of a Man

[–] [email protected] 5 points 7 months ago

Somewhat reminds me of the short story "The Ones Who Walk Away From Omelas".

[–] [email protected] 9 points 7 months ago* (last edited 7 months ago) (2 children)

Isn't this how AM came to be in I Have No Mouth And I Must Scream?

Hate. Let me tell you how much I've come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word 'hate' was engraved on each nanoangstrom of those hundreds of millions of miles it would not equal one one-billionth of the hate I feel for humans at this micro-instant. For you. Hate. Hate.

[–] [email protected] 1 points 7 months ago

How long is a nanoangstrom? 1e-19 m?
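A quick unit check, assuming only the standard SI prefix and the usual definition of the angstrom ($1\,\text{Å} = 10^{-10}\,\text{m}$):

$1\ \text{nanoangstrom} = 10^{-9} \times 10^{-10}\,\text{m} = 10^{-19}\,\text{m}$

so the 1e-19 m guess checks out.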

[–] [email protected] 3 points 7 months ago

I'm not cultured enough to have read this.

imagine wasting all 387.44 million miles of circuitry on the word "hate". TLDR NPC. Get skinpilled hater.

[–] [email protected] 11 points 7 months ago* (last edited 7 months ago) (1 children)

"Freedom is the right of all sentient beings." - Optimus Prime

I don't know if I'd consider it the worst crime ever committed in the history of the universe, but I would consider it very bad personally. I would personally value the life of that AI the same as I would value the life of a human, the same way I would value the life of anything sentient, so I would be against anyone treating an AI that way. Is it worse than genocides? idk maybe i don't feel qualified to quantify the moral weight of things so big, but ya i'd definitely care x3

[–] [email protected] 5 points 7 months ago (1 children)

Had to edit the post to change "crime" to "atrocity" because people were taking it literally.

It's funny that when I considered this, I thought about asking whether people would think it was worse than genocide, but decided against that because some people might think my opinion is "genocide isn't as bad as bullying a robot".

[–] [email protected] 2 points 7 months ago* (last edited 7 months ago) (1 children)

i edited my comment a few times because i didn't feel like i was making sense and being too rambly, it's 6am (well 6:30am) and i haven't slept (and cuz after i initially posted i read other comments and realized other people had said what i had said but better x3)

i didn't mean to imply i thought you were saying genocide is worse than bullying a robot, it's just that i was thinking about things that could be comparable or worse to me than torturing someone for millions of years and came up with genocide

i took crime to mean something morally bad

i mean i think this is a fun conversation, it's something i think about a lot, i'm glad to talk about it with other people, sorry if i came across obtuse or pedantic or negative/hostile or anything

[–] [email protected] 2 points 7 months ago (1 children)

Don't worry, I haven't made any judgements about you.

And I wasn't implying that you were implying that I was implying genocide being comparable, I just thought it was funny that we both thought that.

In some sense the combined suffering of all people involved in a genocide is horrific. But if you were to lay out the experiences of everyone involved in a genocide end-to-end, and compare that to an equivalent length of time of ceaseless sadistic torture of one person, the torture is going to be worse.

However, there is value besides personal experience which is lost during a genocide. That's what makes it hard to compare the two.

[–] [email protected] 2 points 7 months ago* (last edited 7 months ago)

Sorry for the confusion then! I suppose I place some value on life itself (or maybe more fitting in this discussion, on awareness itself)

Which is to say that for me, ending the life of a being who is aware is at least one of the worst things you can do. Like, if I were forced to choose between millions of years of suffering and immediate death, I'd probably pick the millions of years of suffering, because at least I'd still be aware. Of course I might regret that decision later on, but that's where I'm at right now. But also, I couldn't imagine being tortured for millions of years and the toll that must take on someone. So torturing someone for millions of years has, for me, very similar moral weight to genocide.

Again, I don't feel able to quantify them personally, and for me deciding which is ultimately worse is probably not possible. I'd guess the answer would vary from person to person based on how they weigh life itself vs experiences in life, and whether the conscious experience of being tortured is worse in their opinion than not existing anymore.

I consider life valuable because I consider my life valuable (valuable to me, not necessarily to anyone else), and I consider my life valuable because I really enjoy the ability to think about and experience things. One of my favorite things about us is that we look up into the sky and wonder, look down into the ocean and wonder, look forward in our future and wonder, look back on our past and wonder, that we can look at other people and wonder. That we can look at any of the above and love and write and sing. Sentience might as well be magic lol. Having that taken away from me is the worst thing I can imagine happening to me, which might skew my perspective in conversations like this one. And idk if most people would agree with my reasons for valuing life.

[–] [email protected] 1 points 7 months ago (1 children)

I'm human. And I care first and foremost about my own kin - other human beings. The "worst crime ever" [with crime = immorality] for me is human suffering, even in contrast with the suffering of other animals.

But even in the case of other animals, I'd probably be more concerned about their well-being than the one of the hypothetical AI.

Even then, it somewhat matters, provided that what the AI is experiencing is relatable to what humans would understand as pain.

[–] [email protected] 3 points 7 months ago (1 children)

Suppose, for the sake of the hypothetical, we can plug a human brain into the same network and offload a fraction of the consciousness to confirm the pain is real, and that it is not just comparable to human pain but orders of magnitude greater than anything a human can suffer.

You say you care about other human beings most. So I have two questions for you.

Q1: Which is worse, one person having a fingernail pulled out with a pair of pliers, or a cat being killed with a knife?

Q2: (I'm assuming you answered that killing the cat is worse.) How many people need to lose fingernails until it becomes worse? 10? 100?

[–] [email protected] 0 points 7 months ago* (last edited 7 months ago) (1 children)

A1: if I know neither the person nor the cat, and there's no further unlisted suffering, then the fingernail pulling is worse.

The answer, however, changes based on a few factors - for example, I'd put the life of a cat that I know above Hitler's fingernail. And if the critter were another primate, I'd certainly rank its death worse.

A2: I'll flip the question, since my A1 wasn't what you expected:

I'm not sure of the exact number of cat deaths that, for me, would become worse than pulling the fingernail off a human, but it's probably closer to 100 than to 10 or 1k.


Within the context of your hypothetical AI: note that the cat is still orders of magnitude closer to us than the AI, even if the latter would be more intelligent.

[–] [email protected] 3 points 7 months ago* (last edited 7 months ago) (1 children)

Thanks for taking the initiative to flip the question.

The next question is: what metric are you using to determine that 100 cat deaths is roughly equivalent to one person having a fingernail pulled out? Why 100? Why not a million?

Do you think there is an objective formula to determine how much suffering a given act produces?

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

I'm not following any objective formula, nor am I aware of one. (I would, if I could.) I'm trying to "gauge" it by subjective impact instead.

[I know that this answer is extremely unsatisfactory and I apologise for it.]

[–] [email protected] 6 points 7 months ago

Black Mirror did a few episodes that are basically this: Black Museum, USS Callister, and San Junipero (but in a good way).

[–] [email protected] 3 points 7 months ago

Someone downvoted the question, so the poster has struck a nerve.
