this post was submitted on 03 Mar 2024
200 points (86.5% liked)


A.I. Is Making the Sexual Exploitation of Girls Even Worse::Parents, schools and our laws need to catch up to technology, fast.

(page 2) 40 comments
[–] [email protected] 4 points 10 months ago (1 children)

A very ugly side to the technology, I absolutely think this should be considered on the same level as revenge porn and child pornography.

I also fear these kinds of stories will be used to manipulate the public into thinking that banning the tools is in their best interest, instead of punishing the bad actors.

[–] [email protected] 8 points 10 months ago

Your second point here is exactly what we should fear: that it might become legal only for companies and governments to use AI, but not the masses.

That's why I'm of the opinion that we should maybe just get over it. It's going to continue to become easier and easier to use it for horny reasons. The guy wearing the smart glasses might be seeing every woman around him undressed in real time. We're just a few years away from that, and there is no acceptable way to prevent it.

[–] [email protected] 39 points 10 months ago (1 children)

Paywall. Nope. Exploit those girls with your AI then. The messages against it can't be heard without money.

[–] [email protected] 1 points 10 months ago (3 children)

It's bad for the sexual exploitation of any gender.

[–] [email protected] 6 points 10 months ago

Totally agree! It's unfortunate that we live in a sexist-ass society that would downvote a post like this! Sexual exploitation is bad regardless of gender, and there should be nothing controversial about that!

[–] [email protected] 20 points 10 months ago (3 children)

Are... both genders currently being exploited equally?

[–] [email protected] 11 points 10 months ago (4 children)

That's the "all lives matter" argument, and it's disingenuous.

If billionaires are being forced to pay unfair yacht fees, that's a problem. But that doesn't mean it's as immediate and important an issue as starving children, and obviously shouldn't be prioritized the same. That's just an example, but "major in the majors", as my mom taught me...

[–] [email protected] 4 points 10 months ago* (last edited 10 months ago) (1 children)

Ignoring the fact that it does happen is the same as ignoring mental health for men, or when Google does this shit.

Starting to lose more and more respect for Lemmy's userbase as time goes on. I don't care that you think it's "disingenuous", no one should be getting exploited by AI. I don't give a shit what gender they are. If you think I'm being disingenuous to avoid the issue itself, then oh well, there's no discussion to be had.

[–] [email protected] 4 points 10 months ago (1 children)

In discussions of this issue I've come to the conclusion that a not-small portion of those participating in said discussion would probably be doing the exploiting. I guess I'm just too old (as in, over 25) or too "normal" for Lemmy.

[–] [email protected] 81 points 10 months ago (2 children)

This is a human problem, not an AI problem.

Maybe if we hadn't neglected it for the past century.....

[–] [email protected] 24 points 10 months ago (4 children)

AI is definitely making things worse. When I was at school there was no tool for creating deepfakes of girls; now boys sneak a pic and use an app to undress them. That then gets shared, girls find out and obviously become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls' clothes. Neither of those was happening to such a large degree in schools before, as it would create a shit show.

Also, most jurisdictions don't actually have strict AI laws yet, which is making this harder for authorities to deal with. If you genuinely believe that AI isn't at fault here, then you're ignorant of what's happening around the world.

https://www.theguardian.com/technology/2024/feb/29/clothoff-deepfake-ai-pornography-app-names-linked-revealed That's an article about one company that provides an app for deepfakes. It's a shell corp, so it's not easy to shut it down through the law and arrest people; meanwhile, hundreds of teenage girls have been affected by others creating non-consensual nudes of them.

[–] [email protected] 8 points 10 months ago* (last edited 10 months ago) (2 children)

Photoshop has existed for quite some time. Take a photo, google a naked body, paste the face on the body. The AI-powered bit just makes it slightly easier. I don't want a future where my device is locked down and surveilled to the point I can't install what I want on it. Neither should the common man be excluded from taking advantage of these tools. This is a people problem. Maybe culture needs to change. Limit phone use in schools. Technical solutions will likely only bring worse problems. There are probably no lazy solutions here. This is not one of those problems you can just hand over to some company and tell them to figure it out.

Though I could get behind making it illegal to upload and store someone's likeness unless explicit consent was given. That is long overdue. Though some big companies would not get behind that. So it would be a hard sell. In fact, I would like all personal data be illegal to store, trade and sell.

[–] [email protected] 3 points 10 months ago

Though I could get behind making it illegal to upload and store someone’s likeness unless explicit consent was given. That is long overdue. Though some big companies would not get behind that.

But many big companies would love it. Basically, it turns a likeness into intellectual property. Someone who pirates a movie would also be pirating likenesses. The copyright industry would love it; the internet industry, not so much.

Licensing their likeness - giving consent after receiving money, if you prefer - would also be a new income stream for celebrities. They could license their likeness for any movie or show and get a pretty penny without having to even show up. They would just be deep-faked onto some skilled, low-paid double.

[–] [email protected] 6 points 10 months ago

Photoshop has existed for quite some time. Take a photo, google a naked body, paste the face on the body. The AI-powered bit just makes it slightly easier.

Slightly easier? That's one hell of an understatement. Have you ever used Stable Diffusion?

[–] [email protected] 36 points 10 months ago

When I was a kid I used to draw dirty pictures and beat off to them. AI image creation is a paint brush.

I very much disagree with using it to make convincing deepfakes of real people, but I struggle with laws restricting its use otherwise. Are images of ALL crimes illegal, or just the ones people dislike? Murder? I’d call that the worst crime, but we sure do love murder images.

[–] [email protected] 20 points 10 months ago (1 children)

AI is definitely making things worse. When I was at school there was no tool for creating deepfakes of girls; now boys sneak a pic and use an app to undress them. That then gets shared, girls find out and obviously become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls' clothes.

I'm sorry but this is bullshit. You could "photoshop" someone's face / head onto someone else's body already before "AI" was a thing. Here's a tutorial that allows you to do this within minutes, seconds if you know what you're doing: https://www.photopea.com/tuts/swap-faces-online/

That's an article about one company that provides an app for deep fakes. It's a shell corp so not easy to shut down through the law and arrest people, also hundreds of teenage girls have been affected by others creating non consensual nudes of them.

Also very ignorant take. You can download Stable Diffusion for free and add a face swapper to that too. Generating decent looking bodies actually might take you longer than just taking a nude photo of someone and using my previous editing method though.

[–] [email protected] 8 points 10 months ago (4 children)

You could do everything before, that's true, but you needed knowledge/time/effort, so the phenomenon was very limited. Now that it's easy, the number of victims (if we can call them that) is huge. And that changes things. It's always been wrong. Now it's also a problem.

[–] [email protected] 27 points 10 months ago

It’s both.

[–] [email protected] 127 points 10 months ago (2 children)

But they're banning printed books in libraries instead...

[–] [email protected] 66 points 10 months ago (1 children)

I've always taken the "protect our children" argument to have an implied "...from the knowledge that will actually protect them from abusers" based on the things the argument is trotted out for.

[–] [email protected] 13 points 10 months ago

Always be wary when they say either "protect the children" or "for your safety".

[–] [email protected] 31 points 10 months ago

Can't exploit kids as easily when they know what exploitation is.

[–] [email protected] 9 points 10 months ago

This is the best summary I could come up with:


But the idea of such young children being dehumanized by their classmates, humiliated and sexualized in one of the places they’re supposed to feel safe, and knowing those images could be indelible and worldwide, turned my stomach.

And while I still think the subject is complicated, and that the research doesn’t always conclude that there are unfavorable mental health effects of social media use on all groups of young people, the increasing reach of artificial intelligence adds a new wrinkle that has the potential to cause all sorts of damage.

So I called Devorah Heitner, the author of “Growing Up in Public: Coming of Age in a Digital World,” to help me step back a bit from my punitive fury.

In the Beverly Hills case, according to NBC News, not only were middle schoolers sexualizing their peers without consent by creating the fakes, they shared the images, which can only compound the pain.

(It should be noted that in the Beverly Hills case, according to NBC News, the superintendent of schools said that the students responsible could face suspension to expulsion, depending on how involved they were in creating and sharing the images.)

I regularly hear from people who say they’re perplexed that young women still feel so disempowered, given the fact that they’re earning the majority of college degrees and doing better than their male counterparts by several metrics.


The original article contains 1,135 words, the summary contains 230 words. Saved 80%. I'm a bot and I'm open source!
