LLM hallucinations (lemmy.world)
[-] WhatsTheHoldup@lemmy.ml 8 points 2 days ago

> AIs do not hallucinate.

Yes they do.

> They do not think or feel or experience. They are math.

Oh, I think you misunderstand what hallucinations mean in this context.

AIs (LLMs) train on a very, very large dataset. That's what LLM stands for: Large Language Model.

Despite how large this training data is, you can ask it things outside the training set and it will answer just as confidently as it does for things inside its dataset.

Since these answers didn't come from anywhere in the training data, they're considered hallucinations.
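You can see the mechanism with a toy model. This is a minimal sketch, not a real LLM: a hypothetical bigram model trained on three made-up sentences. Like an LLM, it only predicts the next token from context, so it emits a fluent continuation for a prompt regardless of whether the resulting claim ever appeared in its training data; there is no built-in "I don't know".

```python
import random

# Toy word-level bigram "language model" (hypothetical corpus for illustration).
corpus = (
    "the sky is blue . "
    "the grass is green . "
    "the sun is bright ."
).split()

# Bigram table: word -> list of words that followed it in training.
bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, []).append(nxt)

def generate(prompt_word, length=5, seed=0):
    """Continue from prompt_word by sampling next words.

    The model always produces *some* continuation with equal fluency,
    whether or not the completed sentence was in its training data.
    """
    rng = random.Random(seed)
    out = [prompt_word]
    for _ in range(length):
        choices = bigrams.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

# "the moon is ..." never appears in the corpus, yet the model completes
# "is" just as confidently as it would mid-way through a training sentence.
print("the moon " + generate("is"))
```

The point of the sketch: the sampling loop has no branch that checks "was this in my data?", so confident-sounding output is the only kind of output it can produce.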

this post was submitted on 12 May 2025
