this post was submitted on 02 Jul 2025
379 points (97.5% liked)

Technology

72484 readers
3350 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
 

Schools and lawmakers are grappling with how to address a new form of peer-on-peer image-based sexual abuse that disproportionately targets girls.

[–] [email protected] -2 points 4 days ago

Burkas for the win?

[–] [email protected] 12 points 4 days ago (1 children)
[–] [email protected] 5 points 4 days ago

Can't afford this much cheese today to find just the right slice for every bikini photo...

[–] [email protected] 5 points 4 days ago (1 children)

Deepfakes might end up being the modern version of the bikini. In the olden days, wearing one to the beach was scandalous, a sign of moral decay. Yet now we wear much less.

Our grandchildren might simply not give a damn about their nudity, because it is assumed that everyone is deepfaking everyone.

[–] [email protected] 11 points 4 days ago (2 children)

These are all worn voluntarily. This issue isn't the equivalent of scandalously clad young girls; it's as if girls were being involuntarily stripped of their clothing by their classmates. It's not about modesty standards, it's about sexual abuse.

[–] [email protected] -1 points 4 days ago

Unless it is passed off as a real video and circulated for denigration or blackmail, it is very much not at all like assault. And deepfakes do not reproduce the features hidden under your clothes, so it is possible to debunk them if you really have to.

[–] [email protected] 3 points 4 days ago

It can be both. The cornerstone of why nudity can be abused is that society makes it shameful to be bare. If, some generations from now, people can just shrug and not care, that is one less tool an abuser can use against people.

In any case, I am of the mind that people of my generation might be doing their own version of the Satanic Panic, or the reaction against rap music. For better or worse, older people cannot relate to the younger.

[–] [email protected] -4 points 4 days ago (1 children)

Anyone using any kind of AI either doesn't know how consent works, or they don't care about it.

a horrifying development in the intersection of technofascism and rape culture

[–] [email protected] 2 points 4 days ago (1 children)

Any AI? Every application? What kind of statement is this?

[–] [email protected] 2 points 4 days ago (1 children)

AI models (unless you're training your own) are usually trained on data the companies do not have a licence to use. The companies training these models are also notorious for ignoring robots.txt and other measures websites use to stop bots from trawling their data.

Like in crypto, most people in AI are not nerds, just criminal scum.
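
For context, robots.txt is the voluntary convention the comment above is referringing to: a site lists which crawlers may fetch which paths, and a well-behaved bot checks it before requesting anything. A minimal sketch using Python's standard-library `urllib.robotparser` (the bot names and paths here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: one scraper is banned outright,
# everyone else is only kept out of /private/.
rules = """\
User-agent: ExampleDataBot
Disallow: /

User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A compliant crawler consults can_fetch() before every request.
print(rp.can_fetch("ExampleDataBot", "https://example.com/articles/1"))  # False
print(rp.can_fetch("SomeOtherBot", "https://example.com/articles/1"))    # True
print(rp.can_fetch("SomeOtherBot", "https://example.com/private/x"))     # False
```

The catch, and the point of the complaint: nothing enforces this. A crawler that never calls `can_fetch()` sees no error at all, which is why ignoring robots.txt is a policy violation rather than a technical one.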

[–] [email protected] 1 points 4 days ago (1 children)

You are thinking of LLMs, not AI in general.

[–] [email protected] 1 points 4 days ago

I am. And so is OC. Neural networks in general are a different beast, although neither is actual AI; "AI" is just a marketing term at this point.

[–] [email protected] 16 points 4 days ago (1 children)

Maybe let's assume all digital images are fake and go back to painting. Wait... what if children start painting deepfakes?

[–] [email protected] 2 points 4 days ago* (last edited 4 days ago)

Or pasting someone's photo over porn...in their minds...
