I see Google's deal with Reddit is going just great...

[-] xantoxis@lemmy.world 9 points 11 months ago

TBH I'm curious what the difference between this and "hallucinating" would be.

[-] milicent_bystandr@lemm.ee 6 points 11 months ago

I think 'hallucinating' means it makes up the source/idea by (effectively) word association that generates the concept, whereas here it's repeating a real source.

[-] PersonalDevKit@aussie.zone 6 points 11 months ago* (last edited 11 months ago)

Couldn't that describe 95% of what LLMs do?

It is a really good autocomplete at the end of the day; just sometimes the autocomplete gets it wrong.

[-] milicent_bystandr@lemm.ee 3 points 11 months ago

Yes, nicely put! I suppose 'hallucinating' describes when, to the reader, it appears to state a fact, but that fact doesn't at all represent any fact from the training data.

[-] echodot@feddit.uk 3 points 11 months ago

Well, it's referencing something, so the problem is the data set, not an inherent flaw in the AI.

[-] dgerard@awful.systems 15 points 11 months ago

i'm pretty sure that referencing this indicates an inherent flaw in the AI

[-] echodot@feddit.uk 1 points 11 months ago

No, it represents an inherent flaw in the people developing the AI.

That's a totally different thing. The concept is not flawed; the people implementing the concept are.

[-] ebu@awful.systems 11 points 11 months ago* (last edited 11 months ago)

"Of course, this flexibility that allows for anything good and popular to be part of a natural, inevitable precursor to the true metaverse, simultaneously provides the flexibility to dismiss any failing as a failure of that pure vision, rather than a failure of the underlying ideas themselves. The metaverse cannot fail, you can only fail to make the metaverse."

-- Dan Olson, The Future is a Dead Mall

[-] dgerard@awful.systems 8 points 11 months ago
[-] Ultraviolet@lemmy.world 13 points 11 months ago* (last edited 11 months ago)

The inherent flaw is that the dataset needs to be both extremely large and vetted for quality to an extremely high level of accuracy. That can't realistically exist, and any technology that relies on something that can't exist is by definition flawed.

this post was submitted on 23 May 2024
917 points (100.0% liked)

TechTakes

1804 readers

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago