this post was submitted on 20 Apr 2024
487 points (98.4% liked)

Gaming

Sub for any gaming related content!
(page 3) 15 comments
[–] [email protected] 172 points 1 year ago (4 children)

Imagine reading that headline 20 years ago.

[–] [email protected] 33 points 1 year ago (1 children)

explain this to a person in 1998

[–] [email protected] 1 point 1 year ago

Eh, I agree with the reasonable takes here. Nothing wrong with generating that sort of stuff until it starts resembling the likeness of a real living person. Then I think it's just creepy, especially if for some reason you are sharing it 💀

[–] [email protected] 20 points 1 year ago

Wow. Imagine burning out your expensive GPU for a Fortnite skin.

[–] [email protected] 11 points 1 year ago (1 children)

What? Seems like porn generation is the new crypto mining.

[–] [email protected] 16 points 1 year ago

I'd rather have a wealth of new porn around than thousands of random blockchains going around.

At least the porn will probably be useful for someone long term haha

[–] [email protected] 49 points 1 year ago

This feels exploitative AF on multiple levels.

[–] [email protected] 84 points 1 year ago (28 children)

I’ll be a minority voice considering the other comments. But maybe just pay for OnlyFans or whatever you guys use. I’m a generally attractive woman (I can surmise from interactions while trying to date) and I really don’t like the idea that my likeness would be used for something like this. Get your jollies off, but try to be a bit consensual about it. Is that so much to ask?

[–] [email protected] 12 points 1 year ago (1 children)

So I’m not disagreeing with you, but you’re assuming they’re making deepfake images, and the article doesn’t specify that. In fact, I’d bet that it’s just AI-generated “people” who don’t exist.

What about AI porn of a person that doesn’t exist?

[–] [email protected] 25 points 1 year ago (1 children)

However, one of Salad's clients is CivitAi, a platform for sharing AI-generated images which has previously been investigated by 404 Media. It found that the service hosts image-generating AI models of specific people, whose images can then be combined with pornographic AI models to generate non-consensual sexual images.

[–] [email protected] 12 points 1 year ago (1 children)

Fair, somehow I missed that

[–] [email protected] 67 points 1 year ago* (last edited 1 year ago) (3 children)

It isn't too much to ask. According to Dr. K of HealthyGamerGG (a Harvard psychiatrist and instructor), research shows that the release of non-consensual porn (deepfakes, revenge porn, etc.) makes the unwilling subjects suicidal over half the time. It's seriously harmful, and there are other effects like depression, shame, PTSD, and anxiety. There is functionally unlimited porn out there that is made with consent, and if someone doesn't want to be publicly sexually explicit, then that's their choice.

I'm not against AI porn in general (I consider it the modern version of dirty drawings/cartoons), but when it comes to specific likenesses, as with deepfakes, there's clear proof of harm, and that's enough for me to oppose it. I don't believe there's some inherent right to see specific people naked against their will.

[–] [email protected] 54 points 1 year ago (1 children)

Capitalism breeds innovation

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)