LLM hallucinations
(lemmy.world)
If this were true, the world would be a far, far, far better place.
Humans gobble up all sorts of nonsense because they “learnt” it. Same for LLMs.
I'm not saying humans are always aware of whether they're correct, merely of how confident they are. You can still be confidently wrong and believe all sorts of incorrect info.
LLMs aren't aware of anything like self-confidence.
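For what it's worth, the closest thing an LLM has to a "confidence" signal is the probability distribution over its next token. A minimal sketch (with made-up logits, not output from any real model) of how that works:

```python
import math

def softmax(logits):
    # Convert raw model scores (logits) into a probability distribution.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def entropy(probs):
    # Shannon entropy: low when one token dominates, high when it's a coin flip.
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical logits for two next-token predictions.
confident = softmax([9.0, 1.0, 0.5])   # one token clearly dominates
uncertain = softmax([2.0, 1.9, 2.1])   # nearly flat distribution

print(entropy(confident) < entropy(uncertain))  # prints True
```

But that distribution is a statistical artifact of training, not self-awareness: the model doesn't inspect or reason about its own uncertainty the way a person does, which is kind of the point being made here.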