[-] masterplan79th@lemmy.world 7 points 2 months ago

When you ask an LLM a reasoning question, you're not expecting it to think for you; you're expecting that it has crawled multiple people asking semantically the same question and getting semantically the same answers, from other people, that are now encoded in its vectors.

That's why you can ask it: because it encodes semantics.

[-] ebu@awful.systems 24 points 2 months ago

> because it encodes semantics.

if it really did so, performance wouldn't swing up or down when you change syntactic or symbolic elements of problems. the only information encoded is language-statistical

[-] self@awful.systems 23 points 2 months ago

thank you for bravely rushing in and providing yet another counterexample to the “but nobody’s actually stupid enough to think they’re anything more than statistical language generators” talking point

[-] vrighter@discuss.tchncs.de 19 points 2 months ago

so.... a stochastic parrot?

[-] blakestacey@awful.systems 17 points 2 months ago

Rooting around for that Luke Skywalker "every single word in that sentence was wrong" GIF....

[-] froztbyte@awful.systems 16 points 2 months ago

did you ask an LLM for a post to make here? that might explain this mess of a comment

[-] V0ldek@awful.systems 14 points 2 months ago

> because it encodes semantics.

Please enlighten me on how? I admit I don't know all the internals of the transformer model, but from what I know it encodes precisely only syntactic information, i.e. which next token is most likely to follow based on a syntactic context window.

How does it encode semantics? What is the semantics that it encodes? I doubt they have a denotational or operational semantics of natural language; I don't think something like that even exists, so it has to be some smaller model. Actually, it would be enlightening if you could tell me at least what the semantic domain here is, because I don't think there's any naturally obvious choice for that.
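The "only syntactic information" point above can be made concrete with a toy sketch. This is a bigram counter, not a transformer, but it illustrates the same objection: a purely statistical next-token predictor represents nothing but surface co-occurrence frequencies, and the corpus, function names, and example sentence here are all made up for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; any text would do.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which token follows which: pure surface statistics,
# no representation of what any word means.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(token):
    """Return the token that most often followed `token` in the corpus."""
    return counts[token].most_common(1)[0][0]

print(predict_next("the"))  # "cat" — it followed "the" most often
```

A transformer conditions on a much longer context and uses learned continuous representations rather than raw counts, but the training objective is of the same kind: predict the next token from the distribution of tokens seen before, which is exactly the "language-statistical" information the thread is arguing about.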

[-] sc_griffith@awful.systems 14 points 2 months ago* (last edited 2 months ago)

guy who totally gets what these words mean: "an llm simply encodes the semantics into the vectors"

[-] self@awful.systems 15 points 2 months ago

all you gotta do is, you know, ground the symbols, and as long as you’re writing enough Lisp that should be sufficient for GAI

[-] froztbyte@awful.systems 11 points 2 months ago

both your comments made my eye twitch

like what’d happen if bob fucked up the symbols in a pentacle

[-] froztbyte@awful.systems 10 points 2 months ago

also why do we need getaddrinfo? the promptfans will always readily tell you who they are

this post was submitted on 13 Oct 2024
191 points (100.0% liked)

TechTakes