Neovim plugin for managing Ollama workflows
Top 59.9% on sourcepulse
This Neovim plugin integrates with Ollama, enabling users to interact with local Large Language Models directly within their editor. It targets Neovim users who want to leverage LLMs for tasks like code generation, text completion, and content summarization without leaving their development environment. The plugin offers flexible configuration for connecting to Ollama servers and defining custom LLM prompts.
How It Works
The plugin communicates with an Ollama server via HTTP, allowing the server to run locally or remotely. Users can select available models, prompt LLMs with context from their current buffer (including selections, filetype, and surrounding text), and configure how responses are handled (display, replace, or insert). It supports custom parameters like temperature and top_k, and allows defining custom prompts with specific actions and response extraction patterns.
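A custom prompt of the kind described above might be declared roughly as follows. This is a sketch, not the plugin's verbatim API: the module name `ollama`, the field names (`prompt`, `model`, `action`, `options`), and the `Explain_Code` prompt itself are illustrative assumptions — consult the project README for the exact schema.

```lua
require("ollama").setup({
  -- local or remote Ollama server (Ollama's default port is 11434)
  url = "http://127.0.0.1:11434",

  prompts = {
    Explain_Code = {
      -- $sel is replaced with the current visual selection,
      -- $buf with the buffer contents, $ftype with the filetype
      prompt = "Explain this $ftype code:\n$sel",
      model = "mistral",          -- illustrative model name
      action = "display",         -- or "replace", "insert", "display_replace", ...
      options = {                 -- sampling parameters forwarded to Ollama
        temperature = 0.2,
        top_k = 40,
      },
    },
  },
})
```

The `action` field is what selects between the response-handling modes (display, replace, insert) mentioned above.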
Quick Start & Requirements
- Install via lazy.nvim (see the README for a full configuration example).
- Requires nvim-lua/plenary.nvim as a dependency.
- Requires curl to be installed on the system for HTTP communication.

Highlighted Details
- Prompt variables for injecting editor context ($input, $sel, $buf, etc.).
- Response actions: display, replace, insert, display_replace, display_insert.
- Optional stevearc/dressing.nvim for Telescope prompt selection.
- Statusline integration (lualine.nvim).

Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The plugin is currently missing planned features such as direct model download/management and chat functionality. The license is not specified, which may impact commercial adoption.