Neovim plugin for interacting with LLMs and building editor-integrated prompts
model.nvim provides Neovim integration for Large Language Models (LLMs), enabling completions and chat directly within the editor. It targets users who want to customize prompts, experiment with various LLM providers, or utilize local models, offering programmatic prompt building and streaming responses.
How It Works
The plugin is provider-agnostic, supporting OpenAI, Google PaLM, Together, Hugging Face, llama.cpp, and Ollama, with a simple interface for adding custom providers. Prompts are defined in Lua, allowing programmatic customization, asynchronous execution, and multi-step workflows. Responses can be streamed directly into the buffer and appended, inserted, or used to replace text, with features like chat in a dedicated `mchat` filetype buffer.
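As a sketch of what a Lua-defined prompt might look like, the snippet below registers a custom prompt. The provider module path, the `mode` values, and the prompt fields (`provider`, `mode`, `builder`) follow the pattern shown in the project README, but the prompt name `summarize` and its body are illustrative assumptions, not verbatim from the plugin's documentation:

```lua
-- Sketch of a custom prompt definition for model.nvim.
-- Module paths and field names are assumptions based on the README.
local openai = require('model.providers.openai')
local mode = require('model').mode

require('model').setup({
  prompts = {
    -- A hypothetical prompt that summarizes the current selection
    summarize = {
      provider = openai,
      mode = mode.APPEND, -- stream the response after the selection
      builder = function(input)
        return {
          messages = {
            { role = 'user', content = 'Summarize:\n' .. input },
          },
        }
      end,
    },
  },
})
```

Because the `builder` is an ordinary Lua function, it can inspect the buffer, shell out asynchronously, or chain multiple requests before returning the final request payload.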
Quick Start & Requirements
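Installation is typically a short lazy.nvim spec. The repo path `gsuuon/model.nvim` matches the upstream project, but the `cmd` and `ft` triggers below are a hedged sketch based on standard lazy.nvim usage rather than a verbatim copy of the README example:

```lua
-- Minimal lazy.nvim spec (a sketch; lazy-loading triggers are assumptions)
{
  'gsuuon/model.nvim',
  cmd = { 'Model', 'Mchat' }, -- load on first use of the plugin's commands
  ft = 'mchat',               -- or when opening a saved chat buffer
  config = function()
    require('model').setup({
      -- default prompts work out of the box; add custom prompts here
    })
  end,
}
```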
- Install with lazy.nvim (an example spec is provided in the README).
- Requires `curl`.
- Treesitter highlighting in `mchat` buffers requires running `:TSInstall mchat`.

Highlighted Details
- `mchat` filetype for multi-turn conversations, with support for saving and resuming chats.
- Treesitter grammar for syntax highlighting in `mchat` buffers.

Maintenance & Community
Licensing & Compatibility
Limitations & Caveats