A VS Code extension that lets you chat offline with self-hosted models downloaded from Ollama.
- Chat with a model
- Chat with a selection
- Add a file as context to the chat
Getting started:

- Install Ollama and download a model: `ollama run qwen2.5-coder`
- Start the Ollama server by running `ollama serve` in a terminal, or manually open the Ollama app
- Open the command palette in VS Code with Cmd+Shift+P (Mac) or Ctrl+Shift+P (Windows/Linux), then run the Ollama Chat command. This opens the chat window shown in the screenshot.
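
For contributors, a command like this is typically registered in the extension's activation function. The sketch below is illustrative only: the command ID `ollama-chat.start`, the view type, and the webview contents are hypothetical placeholders, not the extension's actual identifiers.

```typescript
import * as vscode from "vscode";

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    // "ollama-chat.start" is an assumed command ID for illustration.
    vscode.commands.registerCommand("ollama-chat.start", () => {
      // Host the chat UI in a webview panel beside the active editor.
      const panel = vscode.window.createWebviewPanel(
        "ollamaChat",            // internal view type (placeholder)
        "Ollama Chat",           // title shown on the editor tab
        vscode.ViewColumn.Beside,
        { enableScripts: true }
      );
      panel.webview.html = "<html><body><h1>Ollama Chat</h1></body></html>";
    })
  );
}
```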
- feat: show an error if Ollama is not running, whether started manually via the app or with `ollama serve` (see the setup-check sketch below)
- feat: show an error if the user has no model installed, with an example command for installing one (covered in the same sketch below)
- feat: restrict messages to a certain number of tokens? (see the token-estimate sketch below)
- feat: process PDFs?
- feat: audio search
- build: do we need a build system like webpack?
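
A minimal sketch of the first two checks, assuming Ollama's default local endpoint at `http://localhost:11434`, its documented `/api/tags` model-listing route, and a runtime with a global `fetch` (Node 18+). Function names and error copy are placeholders:

```typescript
import * as vscode from "vscode";

const OLLAMA_URL = "http://localhost:11434"; // Ollama's default local address

// Check whether the Ollama server is reachable at all.
async function ollamaIsRunning(): Promise<boolean> {
  try {
    const res = await fetch(OLLAMA_URL);
    return res.ok;
  } catch {
    return false; // connection refused: server not started
  }
}

// Check whether at least one model is installed, via GET /api/tags.
async function hasAnyModel(): Promise<boolean> {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  const data = (await res.json()) as { models?: unknown[] };
  return (data.models ?? []).length > 0;
}

// Run both checks before opening the chat; surface actionable errors.
export async function checkOllamaSetup(): Promise<boolean> {
  if (!(await ollamaIsRunning())) {
    vscode.window.showErrorMessage(
      "Ollama is not running. Start it with `ollama serve` or open the Ollama app."
    );
    return false;
  }
  if (!(await hasAnyModel())) {
    vscode.window.showErrorMessage(
      "No models installed. Try: ollama run qwen2.5-coder"
    );
    return false;
  }
  return true;
}
```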
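
For the token-restriction idea, a rough client-side guard could estimate the token count before sending. The ~4 characters per token heuristic below is only an approximation, not the model's real tokenizer, and the limit is a made-up placeholder:

```typescript
const MAX_TOKENS = 2048; // placeholder limit, not a real model constraint

// Rough estimate: English text averages about 4 characters per token.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function withinLimit(message: string): boolean {
  return estimateTokens(message) <= MAX_TOKENS;
}
```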