this post was submitted on 31 Jan 2024
255 points (93.5% liked)


A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”

Archive

[–] [email protected] 11 points 7 months ago (5 children)

I don't get it. Why care? It's not her.

Maybe if they're making money off of her likeness. But without a money trail it just seems like chasing ghosts for not much reason.

[–] [email protected] 3 points 7 months ago

It's like having your nudes leaked but you never sent any. Pretty fucked.

[–] [email protected] 3 points 7 months ago (1 children)

There is a money trail when it's legal. You get blatant advertising of services where you pay to upload your own photos to make deepfakes with them, on all kinds of sites (ahem, Pornhub). That's a level of access that can't be ignored, especially if it's a US-based company providing the service, taking payment via Visa/Mastercard, etc. Relegate it to the underground where it belongs.

[–] [email protected] 1 points 7 months ago (1 children)

I'd be more okay if the law were profit-based, because that's much easier to enforce.

I don't like laws that are near impossible to enforce unless they're absolutely necessary. I don't think this one is absolutely necessary.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago)

I don't think general enforcement against deepfake porn consumption is a practical application of this proposed law in civil court. Practical applications are shutting down US-based deepfake porn sites and advertising. As far as possessors go, consider cases of non-celebrities being deepfaked by their IRL acquaintances. In a scenario where the victim is aware of the deepfake such that they're able to bring the matter of possession to court, don't you agree it's tantamount to sexual harassment? All I'm seeing there is the law catching up to cover disruptive tech with established legal principles.

[–] [email protected] 3 points 7 months ago (2 children)

Because it’s her image?

I’d be fucking furious if someone was sharing say a fake photo of me fucking a watermelon. Doesn’t matter if it’s physically me or not, people would think it was.

[–] [email protected] 4 points 7 months ago

I hear the kids these days call it juicing.

[–] [email protected] 5 points 7 months ago

Would they though? I'd argue nobody thinks those were pictures of Taylor Swift. I'd go further and say that it helps, in the sense that you can always deny even real pictures by arguing they were AI.

[–] [email protected] 24 points 7 months ago

If you're interested, you can look up interviews with people who have been deepfaked in a sexual way, where they explain how they feel and why they care.

[–] [email protected] 20 points 7 months ago (3 children)

Because it's gross, and they do it to minors now. And all they need are pictures of your kids from your social media profile. They even use AI to undress them.

[–] [email protected] 3 points 7 months ago (2 children)

And if I feel that cooking carrots is gross and cooked carrots shouldn't be fed to minors or miners? Should that be illegal as well?

[–] [email protected] 1 points 7 months ago (1 children)

If they’re ripped off of someone else’s farm, yes.

[–] [email protected] 1 points 7 months ago

Stolen carrots that are raw are still cool, though!

[–] [email protected] 5 points 7 months ago

Things that harm individuals or society should be illegal. Since the understanding of what is or isn't harmful can differ quite a bit, we have to come up with compromises or a consensus on what actually becomes illegal.

[–] [email protected] 16 points 7 months ago (3 children)

And here we have the real answer: prudishness. "It's gross." And of course "think of the children." You don't have a real answer, you have fear-mongering.

[–] [email protected] 0 points 7 months ago (1 children)

Not at all. Think of the consequences if someone's nudes were leaked, or an OnlyFans account was made with images of them, and an employer sees it. They're already firing teachers for being on there. And a lot of times they're used in extortion. Not to mention your image is your property. It is you. And nobody else has rights to that.

[–] [email protected] 0 points 7 months ago (1 children)
[–] [email protected] 2 points 7 months ago

You don’t have to take nudes anymore to have nudes leaked. There are Ai that strip clothes from pictures. People have been making csam off of pictures of peoples kids on their Instagram profiles,etc.

[–] [email protected] 15 points 7 months ago (1 children)

I agree the issue is one of puritan attitudes toward sex and nudity. If no one gave a fuck about nude images, they wouldn't be humiliating, and if they weren't humiliating then the victim wouldn't really even be a victim.

However we live in the world we live in and people do find it embarrassing and humiliating to have nude images of themselves made public, even fakes, and I don't think it's right to tell them they can't feel that way.

They shouldn't ever have been made to feel their bodies are something to be embarrassed about, but they have been and it can't be undone with wishful thinking. Societal change must come first. But that complication aside, I agree with you completely.

[–] [email protected] 7 points 7 months ago (1 children)

Even without being puritan, there are just different levels of intimacy we are willing to share with different social circles - which might be different for everyone. It's fundamental to our happiness (in my opinion) to be able to decide for ourselves what we share with whom.

[–] [email protected] 2 points 7 months ago (1 children)

In this case I don't feel fake images are intimate at all, but I don't disagree with your point.

[–] [email protected] 5 points 7 months ago (1 children)

You might not, but others do. People have rather different thresholds when it comes to what they consider intimate. I recommend just listening to interviews with victims; it becomes clear that to them the whole thing is very intimate and disturbing.

[–] [email protected] 5 points 7 months ago (1 children)

And I said their feelings are valid and should be respected regardless of how I might feel about them. I'm not sure if you are looking for something more from me here. Despite my personal feelings that nudity shouldn't be a source of shame, the fact is that allowing nudity to be used to hurt folks on the premise that nudity is shameful is something I utterly oppose. Like, I don't think you should be ashamed if someone has a picture of you naked, but the real enemy is the person saying, "haha! I have pictures of you naked!!!" Whether the pictures are AI, or photoshopped, or painted on a canvas, or even real photographs.

[–] [email protected] 4 points 7 months ago* (last edited 7 months ago) (1 children)

I see, it seems that I misunderstood you. Now that I get your point, I would rather agree.

[–] [email protected] 1 points 7 months ago

In your defense, ending my earlier post with "I agree with you completely" is probably incongruous with my actual feelings given the post I was replying to. I have to heavily edit my posts to keep from rambling and sometimes the thread gets lost in my head by the time I actually hit the post button.

[–] [email protected] 7 points 7 months ago (2 children)

So you would not mind if I send AI sex videos of you to your parents and friends? How about a video where you are sexually degraded, playing in a public space - how would you feel about that? Maybe you performing sexual acts that you find gross yourself? You just need a bit of empathy to understand that not everyone is into exhibitionism and wants intimate things to become public.

[–] [email protected] 10 points 7 months ago (1 children)

I'd really prefer that people not send my parents any kind of porn.

I look at it like someone took my face out of a Facebook picture, printed it, cut it out, pasted it over some porn, and did the same thing.

It'd be a weird thing for them to do, but I don't really need to send the law after them for it. Maybe for harassment?

Laws have a cost, even well-intentioned laws. I don't believe we need new ones for this.

[–] [email protected] 1 points 7 months ago (2 children)

Do you think people might change their opinion on you and act differently after seeing you performing in porn?

Laws have a cost, even well-intentioned laws.

It causes distress to victims, arguably violates personal rights, and is morally and ethically at least questionable. What would be the downsides of criminal prosecution for non-consensual sexual deepfakes?

[–] [email protected] 4 points 7 months ago (1 children)

Yeah, but it's happening to women mostly so these commenters probably don't really care.

[–] [email protected] 2 points 7 months ago

I think a lot of men unfortunately have difficulty empathizing with women here, because they have a rather different experience when it comes to expressing their sexuality and the possible negative consequences.

[–] [email protected] 4 points 7 months ago (1 children)

If they understand that this kind of porn exists? No.

But that's an education thing, not a legal thing.

The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

I don't see this law removing any fake Taylor Swift porn from the Internet. Or really any other celebrity, for that matter.

[–] [email protected] -3 points 7 months ago* (last edited 7 months ago)

If they understand that this kind of porn exists? No.

You know people form opinions on actors based on their roles in movies? So people will change what they think of you and how they act towards you based on media, even if it's clearly fictional.

The downside is giving law enforcement yet another excuse to violate digital privacy. Laws that are difficult/impossible to enforce tend to do more harm than good.

How exactly? Which new abilities to violate digital privacy does this bill give the state?

[–] [email protected] 5 points 7 months ago (1 children)

"So you would not mind if I send AI sex videos of you to your parents and friends?". Seems like sending it would be the dick move. My family and friends probably have no interest in seeing deepfakes of me naked.

"How about a video where you are sexually degraded playing in public space - how would you feel about that?" Considering its not really me.. meh. I don't personally care. Because it's not me.

"Maybe you performing sexual acts that you find gross yourself?" If someone wants to make deepfakes of me eating poop or something for whatever reason.. oh well? It's not really me.

But you do you.

[–] [email protected] -2 points 7 months ago

My family and friends probably have no interest in seeing deepfakes of me naked.

Mostly it won't say that these are deepfakes of you. It's just a WhatsApp message from someone who doesn't like you, and you have to explain a whole new technology to your parents.

Considering it's not really me... meh. I don't personally care. Because it's not me.

You know that, others don't. This will greatly change others' perception of you and how they treat you.

It’s not really me.

Your boss and coworkers don't know.

But you do you.

No, but I have empathy for other people.

[–] [email protected] 17 points 7 months ago (1 children)

Generating sexual images of minors is already illegal. And these images can be generated by anyone modestly technical on their computer, so you can't go after people for creating or possessing the images (except if they look too young), only distribution.

This is unfortunately theater and will do basically nothing. How does a person even know if they are deep fakes? Or consensual? Hell, what's too close of a likeness? Some of those images didn't look that much like her, and at least one was not even realistic.

I'm not saying it's cool people are doing this, just that enforcement of this law is going to be a mess. You wind up with weird standards, like how on Instagram you can show your labia but only through sheer material. Are deep fakes fine if you run them through an oil painting filter?

[–] [email protected] 2 points 7 months ago (1 children)

Are deep fakes fine if you run them through an oil painting filter?

Probably, since nobody could mistake an oil painting for the real person; it's not a deep fake anymore.

[–] [email protected] 0 points 7 months ago (2 children)

I have about a 99% success rate at identifying AI full body images of people. People need to learn to look better. They look just as fake as the oil paintings.

[–] [email protected] 1 points 7 months ago (1 children)
[–] [email protected] 1 points 7 months ago

I think that's relevant when the defense of oil paintings is that you can tell they aren't real. The line can't be "you can't tell they are fake" because... well... you can identify AI artwork 99% of the time, and the other 1% is basically when the pose is arranged exactly so as to conceal the telltale signs and the background is extremely simple so as to give nothing away.

[–] [email protected] 1 points 7 months ago

They look just as fake as the oil paintings.

You can go photorealistic or even hyperrealistic with oil. And with AI you just need a bit of post-processing.