Easy to set up locally hosted LLM with access to the file system
(programming.dev)
Look into setting up the Continue plugin in VS Code. It supports an Ollama backend and can even do embeddings if set up correctly, which means it will try to select the relevant files itself based on your question and helps keep the prompt size down. Here is a link to get started; you might need to choose smaller models for your card.
https://ollama.com/blog/continue-code-assistant
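
For reference, the relevant bits of Continue's `config.json` (usually at `~/.continue/config.json`) can look roughly like this when pointed at Ollama. Treat it as a sketch: the model names are just examples, and newer Continue releases have moved to a YAML config, so double-check the current docs for your version.

```json
{
  "models": [
    {
      "title": "Llama 3.1 8B (local)",
      "provider": "ollama",
      "model": "llama3.1:8b"
    }
  ],
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```

Pull the models first (`ollama pull llama3.1:8b` and `ollama pull nomic-embed-text`) so Continue can find them. The embeddings model is what lets it index your codebase and pick files on its own instead of you pasting everything into the prompt.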