this post was submitted on 09 Dec 2024
1775 points (99.3% liked)

submitted 2 weeks ago* (last edited 2 weeks ago) by [email protected] to c/[email protected]
 

Bluesky post (this was also posted on Twitter)

I was hoping to find a statement from the aggressor, but it seems to be too early.

[–] [email protected] 28 points 2 weeks ago (5 children)

That's fucking crazy. Anyone notice how AI has only made everything shittier?

[–] [email protected] 1 points 2 weeks ago

It's really just the DMCA.

This kind of consequence-free takedown shouldn't be legal, but the DMCA carved that out decades ago.

[–] [email protected] 1 points 2 weeks ago* (last edited 2 weeks ago)

yes, maybe the AI winter was a good thing

[–] [email protected] 2 points 2 weeks ago (1 children)

AI is a great tool, but tools still need to be handled by humans. AI should compile a list of "infringing" sites, and that list should be checked by a human. There should always be a human filter on anything AI does. Fine, let your stuff be written by AI, but check what it wrote and fix the weird or wrong shit. Have AI handle the meat and humans handle the details.
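
Roughly the kind of human-in-the-loop filter being described, as a minimal sketch (the scoring function, threshold, and names are made up for illustration, not any real takedown tooling): the model only compiles candidates, and nothing goes out without a person explicitly approving it.

```python
# Minimal human-in-the-loop sketch: the model only flags, a person approves.
from dataclasses import dataclass


@dataclass
class Candidate:
    url: str
    score: float  # model's confidence that the page is infringing


def flag_candidates(urls, score_fn, threshold=0.9):
    """AI step: compile the list of suspected sites. Nothing is sent here."""
    flagged = []
    for url in urls:
        score = score_fn(url)
        if score >= threshold:
            flagged.append(Candidate(url, score))
    return flagged


def human_review(candidates):
    """Human step: every flagged URL needs an explicit 'y' before any action."""
    approved = []
    for c in candidates:
        answer = input(f"Send notice for {c.url} (score {c.score:.2f})? [y/N] ")
        if answer.strip().lower() == "y":
            approved.append(c)
    return approved


def send_notices(approved):
    # Placeholder for the actual takedown step; it only ever sees
    # items a human explicitly approved.
    for c in approved:
        print(f"notice queued: {c.url}")


if __name__ == "__main__":
    fake_score = lambda url: 0.95 if "mirror" in url else 0.2  # stand-in "model"
    flagged = flag_candidates(["example.com/mirror", "example.com/reviews"], fake_score)
    send_notices(human_review(flagged))
```

The whole point of the design is that the send step never touches anything a human didn't sign off on.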

[–] [email protected] 1 points 2 weeks ago

If a copyright bot sends out false DMCA notices, the programmers should be held responsible as if they had sent the notices themselves. That's how we'd handle it for any other crime.

[–] [email protected] 5 points 2 weeks ago

Why develop and test something in a vacuum when you can beta test it on everyone?

[–] [email protected] 11 points 2 weeks ago

It's automating the enshittification. A large language model doesn't need to sleep and doesn't have a conscience.

AI as we know it now is, in a nutshell, the automation of "I was just following orders". Or a digital factory line of evil. Either way, this is about removing the human element from as many decisions as possible.

It would be tricky, unethical, and in some cases illegal to get people to do the sort of things the owner class and fascists want to do to society. But it's easy to let an AI program go nuts. For the fascists, the cruelty is the point. For the owner class, it clears out anyone who can't afford a lawyer.