AI source for nvim-cmp, enabling remote code completion
This plugin provides AI-powered code completion for Neovim, integrating with the nvim-cmp autocompletion engine. It supports multiple AI providers, including HuggingFace, OpenAI, Codestral, Ollama, and Google Bard, enabling developers to leverage large language models for context-aware suggestions directly within their editor.
How It Works
The plugin acts as a custom source for nvim-cmp, querying the specified AI provider with context from the current buffer (lines before and after the cursor). It handles API interactions, response parsing, and formatting of suggestions for nvim-cmp. The design allows for flexible configuration of providers, models, and context sensitivity, including custom prompt engineering for specific models.
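A minimal setup sketch is shown below, using the option names highlighted in this summary (max_lines, max_timeout_seconds, run_on_every_keystroke, ignored file types); the provider identifiers and the model name are illustrative and should be verified against the plugin's README.

require('cmp_ai.config'):setup({
  provider = 'OpenAI',             -- other providers (HuggingFace, Codestral, Ollama, Bard) use the names given in the README
  provider_options = {
    model = 'gpt-4o-mini',         -- illustrative model name
    -- some providers also accept custom prompt-engineering options here; see the README
  },
  max_lines = 100,                 -- how many buffer lines around the cursor are sent as context
  max_timeout_seconds = 5,         -- request timeout (per the caveats below, not honored for Google Bard)
  run_on_every_keystroke = true,   -- query the provider as you type
  ignored_file_types = {
    -- markdown = true,            -- set a filetype to true to disable the source for it
  },
})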
Quick Start & Requirements
require("lazy").setup({
{'tzachar/cmp-ai', dependencies = 'nvim-lua/plenary.nvim'},
{'hrsh7th/nvim-cmp', dependencies = {'tzachar/cmp-ai'}},
})
- Configure nvim-cmp to include cmp_ai as a source (a sketch follows this list).
- Requires nvim-lua/plenary.nvim.
- Requires curl for Codestral, OpenAI, and HuggingFace; dsdanielpark/Bard-API for Google Bard.
- Set API keys via environment variables (e.g., HF_API_KEY, OPENAI_API_KEY).
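As referenced in the first item above, a sketch of registering cmp_ai alongside other nvim-cmp sources; the extra sources shown are illustrative.

require('cmp').setup({
  sources = {
    { name = 'cmp_ai' },    -- AI completions from this plugin
    { name = 'nvim_lsp' },
    { name = 'buffer' },
  },
})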
Highlighted Details
- Configurable context window (max_lines) and request timeout (max_timeout_seconds).
- Supports run_on_every_keystroke and ignoring specific file types.
- Integrates with lspkind for pretty-printed completion menu items in the nvim-cmp menu (see the sketch after this list).
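A sketch of the lspkind integration mentioned above, using lspkind's cmp_format helper to add kind symbols and tag cmp_ai entries in the completion menu; the menu labels are illustrative.

require('cmp').setup({
  formatting = {
    format = require('lspkind').cmp_format({
      mode = 'symbol_text',       -- kind icon plus text
      menu = {
        cmp_ai   = '[AI]',        -- label AI suggestions in the nvim-cmp menu
        nvim_lsp = '[LSP]',
        buffer   = '[Buffer]',
      },
    }),
  },
})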
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
- max_timeout_seconds is not supported for Google Bard.