I'm not saying humans are always aware of when they're correct, merely of how confident they are. You can still be confidently wrong and hold all sorts of incorrect info.
LLMs aren't aware of anything like self-confidence.