this post was submitted on 12 May 2025
Just Post
[โ€“] [email protected] 0 points 1 week ago

Hallucination is the technical term for when an LLM's output is factually incorrect. Don't confuse it with the ordinary meaning of the word.

A bug in software isn't an actual insect.