Imagine reading that headline 20 years ago.
Eh, I agree with the reasonable takes here. Nothing wrong with generating that sort of stuff until it starts resembling the likeness of a real living person. Then I think it’s just creepy; especially if for some reason you are sharing it 💀
Wow. Imagine burning out your expensive GPU for a Fortnite skin.
What? Seems like porn generation is the new crypto mining.
I'd rather have a wealth of new porn around than thousands of random blockchains going around.
At least the porn will probably be useful for someone long term haha
This feels exploitative AF on multiple levels.
I’ll be a minority voice considering the other comments. But maybe just pay for onlyfans or whatever you guys use. I’m a generally attractive woman (I can surmise from interactions while trying to date) and I really don’t like the idea that my likeness would be used for something like this. Get your jollies off, but try and be a bit consensual about it. Is that so much to ask?
So I’m not disagreeing with you, but you’re assuming they’re making deepfake images, and the article doesn’t specify that. In fact I’d bet that it’s just AI generated “people” that don’t exist.
What about AI porn of a person that doesn’t exist?
However, one of Salad's clients is CivitAi, a platform for sharing AI-generated images which has previously been investigated by 404 Media. It found that the service hosts image-generating AI models of specific people, whose likenesses can then be combined with pornographic AI models to generate non-consensual sexual images.
It isn't too much to ask. According to Dr. K of HealthyGamerGG (Harvard Psychiatrist/Instructor), research shows that the release of non-consensual porn makes the unwilling subjects suicidal over half the time. Non-consensual porn = deepfakes, revenge porn, etc. It's seriously harmful, and there are other effects like depression, shame, PTSD, anxiety, and so on. There is functionally unlimited porn out there that is made with consent, and if someone doesn't want to be publicly sexually explicit then that's their choice.
I'm not against AI porn in general (I consider it the modern version of dirty drawings/cartoons), but when it comes to specific likenesses as with deepfakes then there's clear proof of harm and that's enough for me to oppose it. I don't believe there's some inherent right to see specific people naked against their will.