this post was submitted on 13 Oct 2024
191 points (100.0% liked)
TechTakes
Please enlighten me on how. I admit I don't know all the internals of the transformer architecture, but from what I know it encodes only syntactic information, i.e. which token is most likely to follow given a context window of preceding tokens.
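To make the point concrete, the "most likely next token" step reduces to picking from a probability distribution over the vocabulary. Here is a minimal sketch in plain Python, with a made-up toy vocabulary and made-up logits standing in for what a real transformer would compute from the context:

```python
import math

# Hypothetical toy vocabulary; a real model has tens of thousands of tokens.
vocab = ["the", "cat", "sat", "on", "mat"]

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Pretend the model saw the context "the cat" and produced these logits.
# These numbers are invented for illustration, not from any actual model.
logits = [0.1, 0.2, 2.5, 0.3, 0.4]
probs = softmax(logits)

# Greedy decoding: the "next token" is simply the highest-probability
# entry -- a purely distributional choice over token IDs, with no
# semantic object anywhere in the computation.
next_token = vocab[probs.index(max(probs))]
print(next_token)  # "sat"
```

Whether those learned distributions amount to "semantics" is exactly the question: nothing in this pipeline references a meaning, only co-occurrence statistics over token sequences.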
How does it encode semantics? What is the semantics that it encodes? I doubt they have a denotational or operational semantics of natural language; I don't think anything like that even exists, so it would have to be some smaller model. Actually, it would be enlightening if you could at least tell me what the semantic domain here is, because I don't think there's any naturally obvious choice for it.