Mayonnaise Rule (files.catbox.moe)
submitted 1 year ago by Gork@lemm.ee to c/196
[-] megopie 34 points 1 year ago

Yah, people don’t seem to get that LLMs can’t consider the meaning or logic of the answers they give. They’re just assembling bits of language in patterns that are likely to come next based on their training data.

The technology of LLMs is fundamentally incapable of considering choices or doing critical thinking. Maybe new types of models will be able to do that, but those models don’t exist yet.
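
Rough sketch of what I mean, as a toy next-word chain. Obviously this is not how a real LLM is built (those are neural nets predicting tokens, not a lookup table, and the tiny corpus here is made up for the example), but the "pick whatever usually comes next" part is the same idea:

```python
import random
from collections import defaultdict

# Toy "training data" -- a real model trains on trillions of tokens.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Count which word tends to follow which (a bigram table).
next_words = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    next_words[current].append(following)

def generate(start, length=8):
    """Build text by repeatedly picking a word that tended to come next.

    Nothing here checks meaning or logic; it only asks
    'what usually followed this word in the training text?'
    """
    word, output = start, [start]
    for _ in range(length):
        options = next_words.get(word)
        if not options:
            break
        word = random.choice(options)
        output.append(word)
    return " ".join(output)

print(generate("the"))  # e.g. "the dog sat on the mat and the dog"
```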

[-] CurlyMoustache@lemmy.world 13 points 1 year ago* (last edited 1 year ago)

A grown man I work with (he's in his 50s) tells me he asks ChatGPT stuff all the time, and I can't for the life of me figure out why. It is a copycat designed to beat the Turing test. It is not a search engine or Wikipedia; it just gambles that it can pass the Turing test after every prompt you give it.

[-] megopie 6 points 1 year ago

People want functioning web search back, but rather than address the industry problems that broke an otherwise functional concept, they want a fancy new technology to make the problem go away.

this post was submitted on 10 Feb 2024
1036 points (100.0% liked)
