[-] Madrigal@lemmy.world 86 points 2 weeks ago

I’ve literally seen someone include “Don’t hallucinate” in an agent’s instructions.

[-] rozodru@piefed.world 40 points 2 weeks ago

Asking Claude to not hallucinate is like telling a person to not breathe. It's gonna happen, and happen consistently.

[-] FrederikNJS@piefed.zip 53 points 2 weeks ago

I think the important bit to understand here is that LLMs are never not hallucinating. They just sometimes happen to hallucinate something correct.
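[Editor's note: the point above can be sketched with a toy example. This is not a real LLM; the vocabulary and probabilities are made up. It only illustrates that generation is a single sampling step over a learned distribution, with no separate "truthful" vs "hallucinating" mode the model switches between.]

```python
import random

# Toy next-token distribution (made up for illustration).
# A real model produces such a distribution from its weights;
# nothing in the mechanism marks an output as true or false.
vocab = ["Paris", "London", "Madrid"]
probs = [0.7, 0.2, 0.1]

def next_token(rng):
    # The model always does the same thing: sample a token.
    # Whether the result happens to be factually correct is
    # invisible to this process.
    return rng.choices(vocab, weights=probs, k=1)[0]

rng = random.Random(0)
samples = [next_token(rng) for _ in range(1000)]
# Most samples land on the high-probability answer, but none are
# "known" to be correct -- the plausible and the wrong come from
# the exact same draw.
print(samples.count("Paris"), samples.count("Madrid"))
```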

[-] Kirk@startrek.website 35 points 2 weeks ago

This fact of how LLMs work is not at all widespread enough IMO.

this post was submitted on 07 Apr 2026
453 points (100.0% liked)
