this post was submitted on 17 May 2025
301 points (94.4% liked)

Technology

[–] [email protected] 7 points 3 days ago (2 children)

They didn't have decent filters on what they fed the first generation of AI, and they haven't really improved the filtering much since then, because on the Internet, nobody knows you're a dog.
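The kind of filtering being described is usually a stack of cheap heuristics applied to scraped text before training. A minimal sketch of that idea, with illustrative thresholds that are my own guesses rather than anything a real lab uses:

```python
# Minimal sketch of heuristic corpus filtering for web-scraped training
# data. The function name and all thresholds are hypothetical examples.

def looks_like_usable_text(doc: str,
                           min_words: int = 20,
                           max_symbol_ratio: float = 0.3,
                           max_repeat_ratio: float = 0.5) -> bool:
    """Cheap quality checks of the kind used to pre-filter scraped text."""
    words = doc.split()
    if len(words) < min_words:
        return False  # too short to be a real document
    # Reject documents that are mostly punctuation/markup debris.
    symbols = sum(1 for ch in doc if not ch.isalnum() and not ch.isspace())
    if symbols / max(len(doc), 1) > max_symbol_ratio:
        return False
    # Reject highly repetitive text (common in spam and boilerplate).
    if 1 - len(set(words)) / len(words) > max_repeat_ratio:
        return False
    return True

corpus = [
    "buy now buy now buy now " * 10,   # repetitive spam: filtered out
    "<<<>>> ### |||| @@@@ " * 10,      # markup debris: filtered out
    ("Researchers keep finding that web scraped corpora contain large "
     "amounts of spam, boilerplate, and machine generated text that is "
     "hard to detect."),               # ordinary prose: kept
]
kept = [d for d in corpus if looks_like_usable_text(d)]
```

Heuristics like these catch obvious junk, but, as the comment notes, they do nothing against fluent machine-generated text, which passes every surface-level check.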

[–] [email protected] 1 points 3 days ago

Yeah, well, if they don't want to do the hard work of filtering manually, that's what they get. But methods are being developed that don't require so much training data, and AI is still so new that a lot could change very quickly.

[–] [email protected] 4 points 3 days ago (1 children)

When you flood the internet with content you don't want but can't detect, that becomes quite difficult.

[–] [email protected] 1 points 2 days ago

It is a hard problem. Any human-based filtering will inevitably introduce bias, and some bias (fact vs. fiction masquerading as fact) is desirable. The problem is that human determination of what is fact and what is opinion is... flawed.