LLM hallucinations (lemmy.world)
[-] morrowind@lemmy.ml 1 points 2 days ago* (last edited 2 days ago)

The key difference is that humans are aware of what they know and don't know, and of when they're unsure of an answer. We haven't cracked that for AIs yet.

When AIs do say they're unsure, that's their understanding of the problem, not an awareness of their own knowledge

[-] FundMECFSResearch 1 points 2 days ago

The key difference is humans are aware of what they know and don't know

If this were true, the world would be a far far far better place.

Humans gobble up all sorts of nonsense because they “learnt” it. Same for LLMs.

[-] morrowind@lemmy.ml 1 points 2 days ago

I'm not saying humans are always aware of when they're correct, merely that they're aware of how confident they are. You can still be confidently wrong and know all sorts of incorrect info.

LLMs aren't aware of anything like self-confidence
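To make the distinction concrete: the closest thing an LLM exposes to "confidence" is the probability distribution over its next token. A minimal sketch (plain Python, no model required; the logits are invented for illustration) of reading a confidence-style score off that distribution via softmax entropy:

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability, then normalize exponentials.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    # Shannon entropy in bits: low entropy = a peaked, "confident" distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical next-token logits for two prompts.
peaked = softmax([8.0, 1.0, 0.5, 0.2])  # one token dominates
flat = softmax([1.0, 0.9, 1.1, 1.0])    # the model is "unsure"

print(entropy(peaked) < entropy(flat))  # True: the peaked distribution has lower entropy
```

Low entropy here only measures how peaked the output distribution is; it says nothing about whether the model actually knows the answer, which is exactly the gap the comment is pointing at.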

this post was submitted on 12 May 2025
612 points (100.0% liked)

Just Post
