this post was submitted on 26 Jan 2024
86 points (89.1% liked)

Technology
Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

[–] [email protected] -1 points 9 months ago* (last edited 9 months ago) (2 children)

So she’s less a victim because she’s wealthy? My god you people can justify anything, can’t you?

[–] [email protected] 2 points 9 months ago (1 children)

You just keep shifting your argument to create some sort of sympathy, I guess. No one says a rich person isn't a victim. The point is that being a victim as a wealthy and influential woman like Taylor is a lot different than being a victim in a working-class context. If you disagree with that, you're either being intellectually dishonest or living in a dream world.

Even the law agrees. It's a lot harder to win a defamation lawsuit as a celebrity than it is as a normal person. You typically have to show actual malice. Frankly, that's the legal standard that would probably apply to any lawsuit involving the deepfakes anyway.

[–] [email protected] -2 points 9 months ago (1 children)

The wealth of the victim doesn’t change the crime.

[–] [email protected] 0 points 9 months ago (1 children)
[–] [email protected] 1 points 9 months ago (1 children)

So, creating nude AI deepfakes isn’t a crime? Then there are no victims at all. What’s everyone talking about then?

[–] [email protected] 1 points 9 months ago (1 children)

It can't be a crime unless there is a criminal statute that applies. See if you can find one that applies.

[–] [email protected] 0 points 9 months ago (1 children)

So there are no victims, rich or poor. Why is this a problem?

[–] [email protected] 0 points 9 months ago

Your response doesn't logically respond to my comment. It attempts to reframe the argument by setting up a strawman, and it shows that you fail to understand (or are choosing to ignore, because it doesn't support your reframed argument) the difference between civil and criminal law in the United States.

[–] [email protected] 2 points 9 months ago (1 children)

That is exactly it. She will suffer less compared to someone else this might have happened to, and if you define victimhood on a spectrum, she's less of a victim than housewife, community leader, and preschool teacher Margaret from Montana.

[–] [email protected] -1 points 9 months ago (1 children)

Gross dude. Very gross. Blocking you now as someone who thinks the wealthy can’t be victimized can’t possibly have anything of value to contribute.

Do better.

[–] [email protected] 1 points 9 months ago (1 children)

The guy said "less victimized" and you concluded he meant "cannot be victimized." Can you be any more stupid?

[–] [email protected] -2 points 9 months ago (1 children)

Since you have nothing worthwhile to say, I’m going to go ahead and block your annoying ass.

[–] [email protected] 2 points 9 months ago

Lol, can't even bother to address such a simple point, so pathetic.