# Neovim plugin for LLM-powered code completion
This Neovim plugin provides LLM-powered code completion and generation, inspired by GitHub Copilot. It targets developers seeking AI assistance within their editor, offering "ghost-text" suggestions and configurable backend integrations for various LLM hosting solutions.
## How It Works
The plugin uses llm-ls as its backend language server. It supports multiple backends, including the Hugging Face Inference API, Ollama, OpenAI, and Text Generation Inference (TGI). Requests are made over HTTP, with configuration options for model selection, API tokens, and custom request bodies. The plugin manages context-window limits with tokenizers, ensuring prompts fit within the model's constraints.
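Backend selection and context-window handling come together in the plugin's `setup()` call. A minimal sketch for a local Ollama backend is shown below; the endpoint URL, model name, and tokenizer repository are placeholder assumptions, not values from this document.

```lua
-- Hedged sketch of an llm.nvim-style configuration; all concrete
-- values (URL, model, tokenizer repo) are illustrative assumptions.
require('llm').setup({
  backend = "ollama",                  -- or "huggingface", "openai", "tgi"
  url = "http://localhost:11434",      -- assumed local Ollama endpoint
  model = "codellama:7b",              -- assumed model name
  api_token = nil,                     -- set for hosted backends that need auth
  context_window = 4096,               -- prompt is trimmed to fit this limit
  tokenizer = {
    repository = "codellama/CodeLlama-7b-hf",  -- used to count prompt tokens
  },
  request_body = {
    options = { temperature = 0.2 },   -- example of a custom request body
  },
})
```

With a hosted backend such as the Hugging Face Inference API, the `url` would point at the inference endpoint and `api_token` would carry your API key.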
## Quick Start & Requirements
Install with your preferred plugin manager:

- packer.nvim
- lazy.nvim
- vim-plug

llm-ls is automatically downloaded, or can be installed via mason.nvim.
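For example, a lazy.nvim plugin spec could look like the following; the repository path is an assumption, and the commented `bin_path` key illustrates pointing at a manually installed llm-ls binary.

```lua
-- Hedged lazy.nvim spec; repo path and option names are assumptions.
{
  'huggingface/llm.nvim',
  opts = {
    -- lsp = { bin_path = "/path/to/llm-ls" },  -- override the auto-download
  },
}
```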
## Highlighted Details
## Maintenance & Community
llm-ls binary management is included.

## Licensing & Compatibility
## Limitations & Caveats
The Hugging Face Inference API's free tier is rate-limited; a PRO subscription is recommended for heavier use. The plugin's behavior and model compatibility depend heavily on the chosen backend and model configuration.