Vim plugin for local LLM code completion and chat
This plugin provides Copilot-like code completion and chat inside Vim, using locally run, Ollama-compatible LLMs for privacy and offline use. It targets Vim users who prefer not to switch to NeoVim and want to use open-source models such as Llama3 or Codellama.
How It Works
The plugin integrates with Ollama via its REST API using Python scripts (`complete.py`, `chat.py`), which handle code completion and chat interactions, respectively. Vim communicates with these scripts through standard input/output redirection, embedding AI-generated suggestions and chat responses directly into the editing experience.
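The plugin's actual scripts are more involved, but the shape of the exchange can be sketched as follows: read the code context from standard input, POST it to Ollama's `/api/generate` endpoint, and print the completion to standard output for Vim to consume. This is an illustrative sketch, not the plugin's `complete.py`; the endpoint is Ollama's documented default, and the model name is an assumption (substitute whatever model you have pulled).

```python
#!/usr/bin/env python3
"""Illustrative sketch of a completion round-trip (not the plugin's complete.py)."""
import sys

import httpx

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
MODEL = "codellama:7b-code"  # assumption: any completion-capable model you have pulled


def complete(prefix: str) -> str:
    # Non-streaming request: Ollama returns a single JSON object
    # whose "response" field holds the generated text.
    resp = httpx.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": prefix, "stream": False},
        timeout=60.0,
    )
    resp.raise_for_status()
    return resp.json()["response"]


if __name__ == "__main__":
    # Vim pipes the code context in on stdin and reads the suggestion from stdout.
    print(complete(sys.stdin.read()), end="")
```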
Quick Start & Requirements
- Install with a plugin manager (e.g., `Plug 'gergap/vim-ollama'`).
- Python dependencies (`httpx`, `requests`, `jinja2`) are installed via a setup wizard (`:Ollama setup`); a quick server check is sketched below this list.
- Documentation is available via `:help vim-ollama` within Vim.
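Before running the setup wizard, it can help to confirm that a local Ollama server is reachable. A minimal check, assuming the default endpoint on port 11434 (this snippet is illustrative and not part of the plugin):

```python
import httpx

# List the models the local Ollama server has pulled (default port 11434).
resp = httpx.get("http://localhost:11434/api/tags", timeout=5.0)
resp.raise_for_status()
print([m["name"] for m in resp.json()["models"]])
```

If this prints an empty list, pull a model first (e.g., `ollama pull codellama`) before configuring the plugin.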
Highlighted Details
- Integration with `vim-fugitive`.
Maintenance & Community
No specific contributors, sponsorships, or community links (Discord/Slack) are mentioned in the README.
Licensing & Compatibility
The README does not explicitly state a license. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
This plugin is strictly for Vim and does not support NeoVim. Adding support for new code completion models requires manually creating JSON configuration files.