New WizardCoder model posted and quantized by TheBloke! (twitter.com)
Oh wait, does ooba support this? Nvm then, I'm enjoying using that; I'm just a little lost sometimes haha
I don't know whether it does; I was just saying those two projects seemed similar: both present a frontend for running inference on models, so the user doesn't necessarily have to know or care what backend is used.
Gotcha, koboldcpp seems to be able to run it; it's all just a tiny bit confusing :D
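For anyone else who's a bit lost, here's a minimal sketch of talking to a locally running koboldcpp instance once the quantized model is loaded. It assumes koboldcpp's KoboldAI-compatible API is up on the default port 5001; the endpoint path, request fields, and the instruction-style prompt format are assumptions about a typical setup and may differ between versions.

```python
# Minimal sketch: query a local koboldcpp server over its KoboldAI-compatible API.
# Assumes koboldcpp is already running with the quantized model loaded.
import requests

API_URL = "http://localhost:5001/api/v1/generate"  # assumed default endpoint

payload = {
    # Assumed instruction-style prompt; adjust to whatever format the model expects.
    "prompt": "### Instruction:\nWrite a Python function that reverses a string.\n\n### Response:\n",
    "max_length": 200,     # number of tokens to generate
    "temperature": 0.7,
    "top_p": 0.9,
}

resp = requests.post(API_URL, json=payload, timeout=120)
resp.raise_for_status()

# The API returns a list of results; print the generated text.
print(resp.json()["results"][0]["text"])
```

The nice part is that this same request works regardless of which backend is doing the actual inference, which is the frontend/backend separation mentioned above.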