LLM hallucinations (lemmy.world)
[-] 4am@lemm.ee 52 points 3 days ago

AIs do not hallucinate. They do not think or feel or experience. They are math.

Your brain is a similar model, exponentially larger, that is under constant training from the moment you exist.

Neural-net AIs are not going to meet their hype. Tech bros have not cracked consciousness.

Sucks to see what could be such a useful tool get misappropriated by the hype machine for cheating on college papers, replacing workers, and deepfaking porn of people who aren't willing subjects, all because it's being billed as the ultimate do-anything software.

[-] WhatsTheHoldup@lemmy.ml 8 points 1 day ago

AIs do not hallucinate.

Yes they do.

They do not think or feel or experience. They are math.

Oh, I think you misunderstand what hallucinations mean in this context.

AIs (LLMs) train on a very very large dataset. That's what LLM stands for, Large Language Model.

Despite how large this training data is, you can ask it things outside the training set and it will answer just as confidently as it does for things inside its dataset.

Since these answers didn't come from anywhere in the training data, they're called hallucinations.
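
That "just as confidently" part can be sketched in a toy way. An LLM's output layer runs its raw scores (logits) through a softmax, which always produces a valid probability distribution, so decoding always emits a fluent-sounding token. There's no built-in "I don't know" signal. The logits below are made up for illustration, not from any real model:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Hypothetical next-token logits over a tiny 3-word vocabulary.
in_distribution  = np.array([5.0, 1.0, 0.5])  # prompt well covered by training data
out_of_distribution = np.array([2.1, 1.9, 2.0])  # prompt the model never really saw

p_in = softmax(in_distribution)
p_out = softmax(out_of_distribution)

# Both are proper probability distributions summing to 1, and argmax
# decoding happily picks a token from either. Nothing in the mechanism
# flags the second answer as a guess.
print(p_in, p_out)
```

Either way the model commits to an answer; the only difference is invisible to the user unless they check the facts themselves.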

[-] Couldbealeotard@lemmy.world 1 point 1 day ago

Hallucination is the technical term for when an LLM confidently outputs something factually incorrect or fabricated. Don't confuse that with the normal meaning of the word.

A bug in software isn't an actual insect.

[-] frezik@midwest.social 3 points 2 days ago* (last edited 2 days ago)

They do hallucinate, and we can induce it to do so much the way certain drugs induce hallucinations in humans.

However, it's slightly different from simply being wrong about things. Consciousness is often conflated with intelligence in our language, but they're different things. Consciousness is about how you process input from your senses.

Human consciousness is highly tuned to recognize human faces. So much so that we often recognize faces in things that aren't there. It's the most common example of pareidolia. This is essentially an error in consciousness: a hallucination. You have them all the time even without some funny mushrooms.

We can induce pareidolia in image recognition models. Google did this with its Deep Dream model: a network trained to recognize dogs, with the input image then iteratively modified to amplify whatever the network recognizes. After a few iterations of this, it tends to stick dogs all over the image. We made an AI that has pareidolia for dogs.
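
The mechanism is just gradient ascent on the input image instead of the weights. The real Deep Dream climbs the activations of a deep conv net; here's a deliberately tiny stand-in where the "dog detector" is a single linear filter, so the gradient with respect to the image is the filter itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-layer "dog detector": its score is the dot product
# w @ image. (Deep Dream uses a deep conv net; this linear stand-in
# only illustrates the gradient-ascent-on-the-input mechanism.)
w = rng.normal(size=64)      # stand-in for a trained dog-recognizing filter
image = rng.normal(size=64)  # start from a random, unrelated "image"

score_before = w @ image
for _ in range(10):
    grad = w                     # d(score)/d(image) for a linear detector
    image = image + 0.1 * grad   # nudge every pixel toward "more dog"

score_after = w @ image  # the detector now fires much harder on the image
print(score_before, score_after)
```

Each step pushes the image toward whatever pattern the detector responds to, which is exactly why Deep Dream ends up painting dogs into clouds.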

There is some level of consciousness there. It's not a binary yes/no thing, but a range of possibilities. They don't have a particularly high level of consciousness, but there is something there.

[-] turtlesareneat@discuss.online 15 points 3 days ago

You don't need it to be conscious to replace people's jobs, however poorly, tho. The hype of disruption and unemployment may yet come to pass; if the electric bills are ultimately cheaper than the employees, capitalism will do its thing.

If anyone has doubts - please see everything about the history and practice of outsourcing.

They don't care if quality plummets. They don't even understand how quality could plummet. So many call centers, customer service reps, and IT departments have been outsourced to the cheapest possible overseas vendor, and everyone in the company recognizes how shitty it is, and some even recognize that it's a net loss in the long term.

But human labor is nothing but a line item on a spreadsheet, and if they think they can keep the revenue flowing while reducing that expenditure so that they can increase short term profit margins, they will.

No further questions, they will do it. And everyone outside of the C-suite and its sycophants - from the consumer, to the laid-off employee, to the few remaining employees that have to work around it - everyone hates it.

But the company clearly makes more money, because the managers take credit for reductions in workforce (an easily quantifiable $$ amount) and then make up whatever excuses they need for downstream reductions in revenue (a much more complex calculation that can usually be blamed on things like "the economy").

That's assuming they even have reductions in revenue, which monopolies obviously don't suffer no matter what bullshit they pull and no matter how shitty their service is.

[-] FlashMobOfOne@lemmy.world 7 points 2 days ago

Fun fact, though.

Some businesses that use AI for their customer service chatbots have shitty ones that will give you discounts if you ask. I bought a new mattress a year ago and asked the chatbot if they had any discounts on x model and if they'd include free delivery, and it worked.

this post was submitted on 12 May 2025
612 points (100.0% liked)

Just Post