new qwen architecture? :o
(lemmy.blahaj.zone)
A community all about the Qwens! (LLMs, VLMs, WANs...)
Here are their blog page and their free chat interface.
Posts are allowed to be in any format.
It is advised to put "Qwen" somewhere in the title.
They do, that's called a recurrent model.
And recurrence is critical for a model to have "memory" of past inputs.
It was one of the key advancements a while back in data processing for predictive systems, e.g. speech recognition.
Recurrence is pretty standard now in most neural networks. Plain linear networks are your most basic ones, mostly used to demonstrate the 101 concepts of ML; they don't have a tonne of practical IRL uses aside from some forms of very basic image processing, filter functions, etc.
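To make the "memory" point above concrete, here's a minimal NumPy sketch of a recurrent step. The weights are random placeholders (not a trained model, and not anything Qwen-specific); the point is just that the hidden state `h` depends on every past input, which is what gives the model memory.

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size, input_size = 4, 3

# Random untrained weights, purely illustrative.
W_h = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
W_x = rng.normal(scale=0.1, size=(hidden_size, input_size))

def rnn_step(h, x):
    # The new state mixes the PREVIOUS state with the current input,
    # so information from earlier time steps persists in h.
    return np.tanh(W_h @ h + W_x @ x)

h = np.zeros(hidden_size)
sequence = rng.normal(size=(5, input_size))  # 5 time steps of input
for x in sequence:
    h = rnn_step(h, x)
# h is now a fixed-size summary of the whole sequence.
```

A plain linear (feed-forward) network has no such state: each input is processed independently, which is why it can't model sequences on its own.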
yeaaaa you're right.. i was referring specifically to LLMs, but yes, recurrent models are essentially everywhere else.
i am just surprised we don't have many LLMs with recurrent blocks in them, like this model here does for example. i really hope we go that direction soon...
Afaik all LLMs have very deep recurrence, as that's what provides their context window size.
The more recurrent params they have, the more context window they can store.