LLM hallucinations (lemmy.world)
[-] FundMECFSResearch 1 points 7 months ago

The key difference is humans are aware of what they know and don't know

If this were true, the world would be a far far far better place.

Humans gobble up all sorts of nonsense because they “learnt” it. Same for LLMs.

[-] morrowind@lemmy.ml 1 points 7 months ago

I'm not saying humans are always aware of whether they're correct, merely of how confident they are. You can still be confidently wrong and know all sorts of incorrect info.

LLMs aren't aware of anything like self-confidence.
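
As an aside on what "confidence" even means mechanically for an LLM: the closest thing a model has is the probability it assigns to each candidate next token. The toy sketch below (made-up logits, no real model or API) shows how that per-token probability is computed; it's a statistical score the model emits regardless of whether the answer is right, which is distinct from the self-awareness being discussed here.

```python
# Toy illustration with hypothetical logits -- not any specific model's output.
import numpy as np

def softmax(logits):
    z = logits - logits.max()   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical raw scores a model might produce for four candidate tokens.
vocab = ["Paris", "Lyon", "London", "Berlin"]
logits = np.array([6.1, 2.3, 1.8, 1.2])

probs = softmax(logits)
top = int(probs.argmax())
print(f"model 'confidence' in '{vocab[top]}': {probs[top]:.2%}")
# This number comes out whether or not the answer is actually correct:
# it reflects calibration of the learned distribution, not awareness.
```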
