this post was submitted on 11 Aug 2025
TechTakes
I don't have any real input from promptfondlers, as I don't think I follow enough of them to get a real feel for them. I did find it interesting that on bsky just now I saw somebody claim that LLMs hallucinate a lot less and that anti-AI people aren't taking that into account, while somebody else posted research showing that hallucinations are now harder to spot. (The model made up references to real things, i.e. works that actually exist, only what the LLM attributed to them wasn't in the actual reference.) Which was a bit odd to see side by side. (It does make me suspect 'it hallucinates less' just means they're working out special exceptions for every popular hallucination we see, not structurally fixing the hallucination problem (which I think is prob not solvable).)