LLM hallucinations (lemmy.world)
frezik@midwest.social 3 points 2 days ago* (last edited 2 days ago)

They do hallucinate, and we can induce them to do so much the way certain drugs induce hallucinations in humans.

However, it's slightly different from simply being wrong about things. Consciousness is often conflated with intelligence in our language, but they're different things. Consciousness is about how you process input from your senses.

Human consciousness is highly tuned to recognize human faces. So much so that we often see faces where there are none. It's the most common example of pareidolia. This is essentially an error in consciousness: a hallucination. You have them all the time, even without some funny mushrooms.

We can induce pareidolia in image recognition models. Google did this with its Deep Dream model. It takes a network trained to recognize objects (including lots of dog breeds), then modifies the input image to emphasize whatever the network recognizes, and repeats. After a few iterations, it tends to stick dogs all over the image. We made an AI that has pareidolia for dogs.
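To make the mechanism concrete, here's a minimal sketch of that gradient-ascent idea, not Google's original Deep Dream code: take a pretrained classifier, pick a dog class, and repeatedly nudge the pixels of an input image so the network "sees" more dog in it. The model choice, class index, step size, iteration count, and file names are illustrative assumptions.

```python
# Sketch of DeepDream-style pareidolia: gradient ascent on the input image
# to amplify a dog class. Assumes torchvision is installed; normalization is
# skipped for brevity.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.vgg16(weights=models.VGG16_Weights.DEFAULT).eval()
for p in model.parameters():
    p.requires_grad_(False)  # we optimize the image, not the network

preprocess = T.Compose([T.Resize((224, 224)), T.ToTensor()])
img = preprocess(Image.open("input.jpg")).unsqueeze(0)  # any starting image
img.requires_grad_(True)

DOG_CLASS = 207  # "golden retriever" in ImageNet; any dog class works
optimizer = torch.optim.Adam([img], lr=0.01)

for step in range(50):  # each iteration amplifies whatever looks dog-like
    optimizer.zero_grad()
    logits = model(img)
    loss = -logits[0, DOG_CLASS]  # gradient ascent on the dog logit
    loss.backward()
    optimizer.step()
    img.data.clamp_(0, 1)  # keep pixel values valid

T.ToPILImage()(img.squeeze(0).detach()).save("dreamed.jpg")
```

The point of the loop is that each pass exaggerates whatever faintly resembles the target, so dog-like textures emerge even in images that contain no dogs, which is exactly the pareidolia analogy.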

There is some level of consciousness there. It's not a binary yes/no thing, but a spectrum. They don't have a particularly high level of consciousness, but there is something there.
