this post was submitted on 16 Feb 2024
166 points (94.6% liked)

Technology


An in-depth police report obtained by 404 Media shows how a school, and then the police, investigated a wave of AI-powered “nudify” apps in a high school.

[–] [email protected] 10 points 8 months ago (3 children)

I kinda doubt porn would be a problem with Sora, just like it's not a problem with DALL-E. The model is locked down on OpenAI's servers, and open-source models are nowhere near as good yet. Even if a comparable downloadable model existed, it would be computationally expensive; I doubt teens could freely run it.

[–] [email protected] 5 points 8 months ago* (last edited 8 months ago) (1 children)

Um… the Taylor Swift porn deepfakes were DALL-E.

Sure, they try to prevent that stuff, but it's hardly perfect. And not all bullying is easily spotted. Imagine a deepfake of a kid sending a text message where the bubbles are green, or one where they're smiling at someone they hate.

Also, Stable Diffusion is more than good enough for this stuff. It's free, and any decent gaming laptop can run it. Mine takes about 20 seconds to produce a decent deepfake… I've used it to touch up my own photos.
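For context on how low the barrier is: a minimal local Stable Diffusion run with Hugging Face's `diffusers` library looks like the sketch below. It assumes a CUDA-capable GPU and the `diffusers`/`torch` packages; the model ID and prompt are illustrative placeholders, not anything from this thread.

```python
# Minimal local Stable Diffusion sketch using Hugging Face's `diffusers`.
# Assumes a CUDA GPU and `pip install diffusers torch`; model ID and
# prompt are placeholders chosen for illustration.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # multi-GB download on first run
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# A single 512x512 image takes on the order of tens of seconds on a
# midrange laptop GPU, consistent with the ~20 s figure above.
image = pipe("a watercolor painting of a lighthouse").images[0]
image.save("out.png")
```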

[–] [email protected] 1 points 8 months ago

the Taylor Swift porn deepfakes were DALL-E.

Got a source for that? I've only used DALL-E 3, but it's really picky about nudity, copyright, and likeness rights.

[–] [email protected] 16 points 8 months ago (1 children)

Even if a comparable downloadable model existed, it would be computationally expensive; I doubt teens could freely run it.

For now, but with every new technology, hardware efficiency optimizations aren't far behind. Especially considering that the compute required for training is far greater than the compute required for inference.
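That asymmetry can be made concrete with a standard rule of thumb from the scaling-law literature (roughly 6 FLOPs per parameter per token for training versus 2 for inference; the model size and token counts below are hypothetical, not figures from this thread):

```python
# Rough transformer FLOP heuristic: training costs about 6 * N FLOPs
# per token, inference about 2 * N, and training also runs over vastly
# more tokens than any single user ever generates.
N = 7e9               # parameters (a hypothetical 7B model)
train_tokens = 1e12   # assumed training-set size (illustrative)
gen_tokens = 1e3      # one user generating a thousand tokens

training_flops = 6 * N * train_tokens
inference_flops = 2 * N * gen_tokens

# Training ends up billions of times more expensive than one generation,
# which is why "too heavy to train at home" never implied "too heavy to run".
ratio = training_flops / inference_flops
print(f"training / one-generation cost: {ratio:.1e}")
```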

Considering the glacial speed our government moves at, I'd bet on those hardware efficiency optimizations arriving before any significant law gets implemented.

[–] [email protected] 3 points 8 months ago (1 children)

Wait for ASICs. When I started mining Bitcoin on dual-core CPUs, I would never have believed a little USB stick would one day do 100x what 30 of those full-sized PCs could. The software behind generative models still needs to be fleshed out before hardware makers know which direction to go, but a little AI machine that draws maybe 50 W max yet has 256 GB of high-speed RAM for model storage is not far away. Anyone with $500 (or whatever, idk) will be able to order a machine off eBay, load it up with whatever models they want, and start generating whatever they want locally.

[–] [email protected] 1 points 8 months ago

I don't think it'll be as easy as calculating SHA-256 hashes, so ASICs that small might never be a thing.
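For contrast, this is the entirety of what a Bitcoin mining ASIC computes: one small, fixed function (double SHA-256) repeated endlessly, which is exactly the kind of workload that bakes well into silicon. Model inference has no comparably tiny inner loop. A sketch (the header bytes are fake):

```python
import hashlib

def block_hash(header: bytes) -> str:
    """Bitcoin-style double SHA-256: a tiny fixed function, trivially
    parallel, hence easy to implement directly in hardware."""
    return hashlib.sha256(hashlib.sha256(header).digest()).hexdigest()

# Miners just sweep a nonce through this same function billions of
# times per second; there is no large model to keep in memory.
print(block_hash(b"candidate block header, nonce=42"))
```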

On the other hand, brains do use orders of magnitude less power, so who knows.

[–] [email protected] 23 points 8 months ago

open-source models are nowhere near as good yet

By the time a law would be adopted, open-source models probably will be. I wouldn't want to rely on the "kindness" of commercial entities as the sole protector of consumer welfare; we've seen how well that works with Google and Facebook.