LLM hallucinations (lemmy.world)
Couldbealeotard@lemmy.world (1 point, 2 days ago)

Hallucination is the technical term for when the output of an LLM is factually incorrect. Don't confuse that with the normal meaning of the word.

A bug in software isn't an actual insect.
