this post was submitted on 06 Aug 2024
78 points (96.4% liked)
Apple
You can't tell an LLM not to hallucinate; that would require it to actually understand what it's saying. "Hallucinations" are just LLMs bullshitting, because that's what they do. LLMs aren't actually intelligent, they're just using statistics to remix existing sentences.
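The "using statistics to remix existing sentences" point can be illustrated with a toy example. The sketch below is a simple bigram model, not how an actual LLM works internally (those use neural networks over tokens), but the sampling step is the same idea: pick the next word from a learned probability distribution, with no understanding involved. All names and the corpus here are made up for illustration.

```python
import random
from collections import defaultdict

# A tiny corpus the "model" learns from.
corpus = [
    "the model predicts the next word",
    "the model has no idea what the words mean",
    "the next word is chosen by probability",
]

# Count which word follows which (the "statistics").
follows = defaultdict(list)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        follows[a].append(b)

def generate(start: str, length: int = 8, seed: int = 0) -> str:
    """Sample a 'new' sentence by chaining observed word pairs.

    Every transition is one the model has literally seen before;
    it remixes, it never understands.
    """
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        nxt = follows.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

print(generate("the"))
```

Nothing stops a model like this from chaining words into a fluent-sounding sentence that is factually nonsense, which is exactly the point about hallucination: there is no fact-checking step, only probability.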
I wish people would say "machine learning" or "LLMs" more often instead of making "AI" the buzzword. It really irks me. IT'S NOT ACCURATE! THAT'S NOT WHAT IT IS! STOP DEMEANING TRUE MACHINE CONSCIOUSNESS!
Gotta let the people know they're getting scammed with false advertising