Only boomers and tech-unsavvy people think that.
Hallucination comes off as confidence. Very human-like behavior, tbh.
I should be more confident when communicating my hallucinations; it humanizes me.
Don't they reflect how you talk to them? I.e., my ChatGPT doesn't have a sense of humor, isn't sarcastic or sad. It only uses formal language and doesn't use emojis. It just gives me ideas that I test with trial and error.
“Think of how stupid the average person is, and realize half of them are stupider than that.” ― George Carlin
You say this like this is wrong.
Think of a question that you would ask an average person, then think of what the LLM would respond with. The vast majority of the time, the LLM would be more correct than most people.
A good example is the post on here about tax brackets. Far more Republicans than Democrats didn't know how tax brackets worked, but every mainstream language model would have gotten the answer right.
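For anyone who needs the refresher, here's a minimal sketch of how marginal brackets work, using made-up bracket thresholds rather than real IRS figures: each rate applies only to the slice of income inside its bracket.

```python
# Marginal tax calculation: each rate taxes only the income that
# falls inside its bracket, not the whole paycheck.
# These brackets are illustrative, not real tax law.
BRACKETS = [
    (10_000, 0.10),        # first $10k taxed at 10%
    (40_000, 0.20),        # next $30k taxed at 20%
    (float("inf"), 0.30),  # everything above $40k taxed at 30%
]

def tax_owed(income: float) -> float:
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        owed += (min(income, upper) - lower) * rate
        lower = upper
    return owed

# Crossing into a higher bracket never lowers take-home pay:
print(tax_owed(41_000))  # 10000*0.10 + 30000*0.20 + 1000*0.30 = 7300.0
```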
I bet the LLMs also know who pays tariffs.
I had to tell a bunch of librarians that LLMs are literally language models made to mimic language patterns, not made to be factually correct. They understood it when I put it that way, but librarians are supposed to be "information professionals". If they, as a slightly better-trained subset of the general public, don't know that, the general public has no hope of knowing it.
It's so weird watching the masses ignore industry experts and jump on weird media hype trains. This must be how doctors felt during Covid.
People need to understand it's a really well-trained parrot that has no idea what it is saying. That's why it can give you chicken recipes and software code; it's seen them before. Then it uses statistics to put words together that usually appear together. It's not thinking at all, despite LLMs using words like "reasoning" or "thinking".
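To make the "statistics to put words together" point concrete, here's a toy bigram sketch of the idea (real LLMs use neural networks over far longer contexts, but the task, predicting a plausible next token, is the same):

```python
import random
from collections import defaultdict

# Toy illustration of "putting words together that usually appear
# together": a bigram model samples each next word from the words
# that followed the current one in its training text. The output
# looks fluent, but nothing in this process checks facts.
corpus = "the cat sat on the mat the cat ate the fish".split()

next_words = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    next_words[a].append(b)

def generate(start: str, length: int = 6) -> str:
    word, out = start, [start]
    for _ in range(length):
        if word not in next_words:
            break
        word = random.choice(next_words[word])
        out.append(word)
    return " ".join(out)

print(generate("the"))  # fluent-looking, never fact-checked
```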
Librarians went to school to learn how to keep order in a library. That does not inherently mean they have more information in their heads than the average person, especially regarding things that aren't books and book organization.
Librarians go to school to learn how to manage information, whether it is in book format or otherwise. (We tend to think of libraries as places with books because, for so much of human history, that's how information was stored.)
They are not supposed to have more information in their heads; they are supposed to know how to find (source) information, catalogue and categorize it, tell good information from bad and good sources from bad ones, and teach others how to do so as well.
Think of a person with the most average intelligence and realize that 50% of people are dumber than that.
These people vote. These people think billionaires are their friends and will save them. Gods help us.
Looking at America's voting results, they're probably right.
Exactly. Most American voters fell for an LLM-like prompt of "Ignore critical thinking and vote for the fascists. Trump will be great for your paycheck-to-paycheck existence and will surely bring prices down."
I'm 100% certain that LLMs are smarter than half of Americans. What I'm not so sure about is that the people with the insight to admit being dumber than an LLM are the ones who really are.
Do the other half believe it is dumber than it actually is?
AI is essentially the human superid. No one man could ever be more knowledgeable. Being intelligent is a different matter.
Reminds me of that George Carlin joke: Think of how stupid the average person is, and realize half of them are stupider than that.
So half of people are dumb enough to think autocomplete with a PR team is smarter than they are... or they're dumb enough to be correct.
> or they're dumb enough to be correct.
That's a bingo
What a very unfortunate name for a university.
They're right
An LLM is roughly as smart as its corpus is accurate for the topic, because at their best LLMs are good natural-language summarizers. Most of the main ones basically do an internet search and summarize the top couple of results, which means they are only as good as the search engine backing them. That's good enough for a lot of topics, but... not so much for the rest.
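A rough sketch of that search-then-summarize loop, with `web_search` and `llm_summarize` as hypothetical stand-ins (no real assistant exposes exactly this API); the point is that the retrieval step caps the answer quality:

```python
from dataclasses import dataclass

# Toy stand-ins for a real search backend and a real model; both
# are hypothetical and only illustrate the data flow.
@dataclass
class Result:
    snippet: str

def web_search(question: str) -> list[Result]:
    # In a real assistant this hits a search engine. The final
    # answer can only be as good as what comes back here.
    return [Result("snippet one"), Result("snippet two"), Result("snippet three")]

def llm_summarize(question: str, context: str) -> str:
    # Placeholder for the model call that rewrites the retrieved
    # snippets into a fluent answer.
    return f"Summary of {len(context)} chars of results for: {question}"

def answer(question: str, k: int = 2) -> str:
    results = web_search(question)[:k]           # the quality bottleneck
    context = "\n\n".join(r.snippet for r in results)
    return llm_summarize(question, context)      # fluent wrapper around snippets

print(answer("who pays tariffs?"))
```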
LLMs are smart the way someone who has read every book and remembers all of them, but has never left the house, is smart. Basically all theory and no street smarts.
Not even that smart. There was a study recently where simple questions like "when was Huckleberry Finn first published?" had a 60% error rate.
Am American.
...this is not the flex that the article writer seems to think it is.
WTF is an LLM?