A lightweight, offline, privacy-respecting AI coding assistant integrated directly into your terminal Vim.
Powered by Ollama and local models like `mistral`, `llama3`, and `codellama`.
No internet. No telemetry. 100% local.
## Features

- Trigger the assistant with `\a` while selecting code in visual mode
- Preview AI-generated code with `vimdiff`
- Accept or reject changes interactively
- Automatically replaces selected code if approved
- Displays a detailed explanation in a vertical split (formatted as comments)
- Smart comment formatting based on filetype (`//`, `#`, etc.)
- Works entirely offline, with no OpenAI or cloud APIs required
## Installation

```bash
git clone https://github.com/agace/vim-ai-assistant.git
cd vim-ai-assistant
pip install -r requirements.txt
```

Make sure Ollama is installed and running, then pull a model (e.g. `mistral`):

```bash
ollama pull mistral
ollama serve
```

You can replace `mistral` with any compatible model, such as `llama3` or `codellama`. Just make sure the `MODEL` variable in `assistant.py` matches the model you pulled. Ollama supports a wide variety of LLMs; the full list is available at https://ollama.ai/library.
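For orientation, here is a minimal sketch of how a script like `assistant.py` might use the `MODEL` variable to call Ollama's local HTTP API. The function name and structure are illustrative assumptions, not the plugin's actual code:

```python
import json
import urllib.request

MODEL = "mistral"  # must match the model you pulled with `ollama pull`

def ask_ollama(prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its full response."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # request one complete JSON object instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```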
Next, append the plugin's snippet to your `.vimrc`:

```bash
cat .vimrc-snippet >> ~/.vimrc
```

Then reload Vim:

```vim
:source ~/.vimrc
```

## Usage

- Open a file in Vim
- Select a block of code in visual mode
- Press `\a`
- Enter a custom instruction (e.g. `"simplify this"`, `"optimize performance"`, `"fix and explain"`)
- You'll see a side-by-side diff preview (via `vimdiff`)
- Press `y` to accept the changes
- Press `n` to cancel and leave the code untouched
- After confirming, an explanation opens in a vertical split, formatted as comments (see the sketch after this list)
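The comment formatting in the last step depends on the current filetype. Below is a hypothetical sketch of how such a mapping could work; the table and helper are assumptions for illustration, not the plugin's actual implementation:

```python
# Hypothetical filetype-to-comment-prefix table; the real plugin may differ.
COMMENT_PREFIXES = {
    "python": "# ",
    "sh": "# ",
    "javascript": "// ",
    "c": "// ",
    "vim": '" ',
}

def as_comments(explanation: str, filetype: str) -> str:
    """Prefix each line of the explanation with the filetype's comment marker."""
    prefix = COMMENT_PREFIXES.get(filetype, "# ")  # default to '#' when unknown
    return "\n".join(prefix + line for line in explanation.splitlines())
```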
## Configuration

- **Change the AI model:** Edit the `MODEL` variable in `assistant.py` to use a different local model (e.g., `llama3`, `codellama`).
- **Customize the keybinding:** The default keybinding is `\a` in visual mode. You can change it in your `.vimrc`:

  ```vim
  vnoremap <leader>a :<C-U>call AIHelper()<CR>
  ```

- **Customize the prompt:** To change how the assistant behaves, edit `prompt.txt`. This is the system message sent to the AI; it controls formatting, tone, and expectations. (A hypothetical example follows this list.)
- **Change the temp file location (optional):** You can edit `assistant.py` or `AIHelper()` in `.vimrc-snippet` to change where input/output files are written.
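For a sense of what that system message can look like, here is a purely hypothetical example; the actual `prompt.txt` ships with the repository and may read quite differently:

```text
You are a coding assistant. Rewrite the selected code according to the
user's instruction. Return only the revised code first, then a short,
plain-language explanation of what changed and why.
```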
## Troubleshooting

- **Nothing happens after pressing `\a`:** Ensure you're in visual mode and have selected code, then press `\a`.
- **`Error: command not found: ollama`:** Ollama must be installed and running (see the quick check after this list). See: https://ollama.com
- **Model not found:** Make sure you've run `ollama pull mistral` (or your desired model), and that `assistant.py` uses the correct model name.
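If you are not sure whether the server is up, a quick connectivity check against Ollama's default address (assuming the stock `localhost:11434`) lists the locally installed models:

```python
import json
import urllib.request

# Prints the locally installed models if the Ollama server is reachable.
try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
        models = [m["name"] for m in json.loads(resp.read())["models"]]
        print("Ollama is up; local models:", models)
except OSError as exc:
    print("Ollama is not reachable:", exc)
```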