this post was submitted on 24 Apr 2024
285 points (95.5% liked)

Technology

(page 2) 22 comments
[–] Buelldozer@lemmy.today 8 points 1 year ago
[–] peanuts4life@lemmy.blahaj.zone -2 points 1 year ago
[–] themeatbridge@lemmy.world 31 points 1 year ago (2 children)

No reason not to ban them entirely.

The problem is enforcing the ban. Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files? It would be trivial to host a site in a country without legal protections and make the software available from anywhere.

[–] mynamesnotrick@lemmy.zip 11 points 1 year ago (2 children)

I feel like a sensible, realistic course of action is for enforcement to focus on the act of sharing/distributing. It would be way too broad otherwise, since the tools that generate this stuff have unlimited purposes. Obvious cases involving children should be dealt with at the point of production, but the enforcement mechanism needs to be on the sharing/distribution side. Unfortunately, the analogy here is "blame the person, not the tool."

[–] catloaf@lemm.ee 6 points 1 year ago

Right. And honestly, this should already be covered under existing harassment laws.

[–] 520@kbin.social 21 points 1 year ago (3 children)

Would it be a crime to have access to the software, or would they need to catch the criminals with the images and video files?

The problem with the former is that it would outlaw any self-hosted image generator. Any image generator is capable of being used for deepfake porn.

[–] WhyDoYouPersist@lemmy.world 11 points 1 year ago (2 children)

For some reason I thought it was mainly to protect Taylor Swift, with teen girls being an afterthought.

[–] Imgonnatrythis@sh.itjust.works 3 points 1 year ago (1 children)

Won't somebody please think of Taylor?!

[–] otp@sh.itjust.works 83 points 1 year ago* (last edited 1 year ago) (2 children)

The laws regarding a lot of this stuff seem to ignore that people under 18 can and will be sexual.

If we allow people to use this tech for adults (which we really shouldn't), then we have to accept that people will use the same tech on minors. It isn't even necessarily pedophilia in all cases (such as when the person making them is also a minor)*, but it's still something that very obviously shouldn't be happening.

* we don't need to get into semantics. I'm just saying it's not abnormal (the way pedophilia is) for a 15-year-old to be attracted to another 15-year-old in a sexual way.

Without checks in place, this technology will INEVITABLY be used to undress children. If the images are stored anywhere, then these companies will be storing/possessing child pornography.

The only way I can see to counteract this would be to invade the privacy of users (and victims) to the point where nobody using them """legitimately""" would want to use it...or to just ban them outright.

[–] micka190@lemmy.world 20 points 1 year ago (3 children)

such as when the person making them is also a minor

I get the point you're trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it's still possession.

[–] BrianTheeBiscuiteer@lemmy.world 56 points 1 year ago (4 children)

And that's still a bit messed up. It's a felony for a teen to have nude pictures of themselves, and they'll be registered sex offenders for life and probably ineligible for most professions. That seems like quite a gross overreaction. There needs to be a lot of reform in this area, but no politician wants to look like a "friend" to pedophiles.

[–] otp@sh.itjust.works 10 points 1 year ago (2 children)

I get the point you're trying to make. But minors taking nudes of themselves is illegal in a lot of places, because it's still possession.

I agree. On the one hand, I understand why it could be good to make it illegal (to prevent child porn from existing), but on the other, it seems silly to treat it as a case of pedophilia.

[–] Zorque@kbin.social 22 points 1 year ago

Which is more of a "zero-tolerance" policy, akin to giving the same punishment to a student defending themselves as to the person who initiated the attack.

[–] autotldr@lemmings.world -1 points 1 year ago

This is the best summary I could come up with:


Caroline Mullet, a ninth grader at Issaquah High School near Seattle, went to her first homecoming dance last fall, a James Bond-themed bash with blackjack tables attended by hundreds of girls dressed up in party frocks.

Since early last year, at least two dozen states have introduced bills to combat A.I.-generated sexually explicit images — known as deepfakes — of people under 18, according to data compiled by the National Center for Missing & Exploited Children, a nonprofit organization.

The spread of nudification apps is enabling the mass production and distribution of false, graphic images that can potentially circulate online for a lifetime, threatening girls' mental health, reputations and physical safety.

A lawyer defending a male high school student in a deepfake lawsuit in New Jersey recently argued that the court should not temporarily restrain his client, who had created nude A.I. images of a classmate.

Under the new Louisiana law, any person who knowingly creates, distributes, promotes or sells sexually explicit deepfakes of minors can face a minimum prison sentence of five to 10 years.

After learning of the incident at Issaquah High from his daughter, Senator Mullet reached out to Representative Orwall, an advocate for sexual assault survivors and a former social worker.


The original article contains 1,288 words, the summary contains 198 words. Saved 85%. I'm a bot and I'm open source!
