LLM hallucinations (lemmy.world)
[-] morrowind@lemmy.ml 3 points 2 days ago

And how do you think it predicts that? All that complex math can be clustered into higher-level structures. One could almost call it... thinking.

Besides, we have reasoning models now, so they can emulate thinking if nothing else.

[-] merc@sh.itjust.works 1 point 2 days ago

One could almost call it.. thinking

No, one couldn't, unless one was trying to sell snake oil.

so they can emulate thinking

No, they can emulate generating text that looks like text typed up by someone who was thinking.

[-] morrowind@lemmy.ml 1 point 2 days ago

What do you define as thinking if not a bunch of signals firing in your brain?

[-] merc@sh.itjust.works 1 point 1 day ago

Yes, thinking involves signals firing in your brain. But not just any signals. Fire the wrong signals and someone's having a seizure, not thinking.

Just because LLMs generate words doesn't mean they're thinking. Thinking involves reasoning and considering something. It involves processing information, storing memories, and bringing them up later as appropriate. We know LLMs aren't doing that because we know what they are doing, and what they're doing is simply generating the next word based on the previous words.
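To make the "next word based on previous words" point concrete: here is a toy sketch of an autoregressive generation loop. This is a bigram word counter, not a neural network, and the corpus, function names, and greedy decoding choice are all illustrative assumptions, but real LLM decoding has this same overall shape: look at the text so far, pick a likely next token, append, repeat.

```python
# Toy autoregressive generation: choose each next word using only the
# words produced so far. A bigram count table stands in for the learned
# probability distribution a real LLM would compute.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word in the corpus.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, n=5):
    out = [start]
    for _ in range(n):
        candidates = follows[out[-1]]
        if not candidates:
            break
        # Greedy decoding: always append the most frequent next word.
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))
```

Nothing in this loop stores new memories or reasons about the claim being made; it only extends the sequence one word at a time, which is the distinction the comment above is drawing.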

this post was submitted on 12 May 2025
Just Post
