VS Code extension for local LLM-assisted code completion
This VS Code extension provides local LLM-assisted code and text completion, targeting developers who want to leverage large language models for enhanced productivity without relying on cloud services. It offers features like auto-suggestion, flexible acceptance shortcuts, and manual suggestion toggling, aiming for high-quality, performant completions even on consumer hardware.
How It Works
The extension integrates with a running llama.cpp server instance, which handles the actual LLM inference. It supports "fill-in-the-middle" (FIM) compatible models, enabling it to suggest code or text based on the context around the cursor. A key advantage is its "smart context reuse" mechanism, which allows it to handle very large contexts efficiently, even on lower-end hardware, by intelligently reusing and chunking context from open files and yanked text.
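On the server side, FIM completion goes through llama.cpp's /infill endpoint, which takes the text before and after the cursor and returns a completion. Below is a minimal sketch of such a request payload; the field names are assumed from the llama.cpp server API at the time of writing and should be verified against your server version:

```python
import json

def build_infill_payload(prefix: str, suffix: str, n_predict: int = 32) -> dict:
    """Build a fill-in-the-middle payload for llama-server's /infill endpoint.

    Field names follow the llama.cpp server API; check your llama.cpp
    version's server README, as the API may change between releases.
    """
    return {
        "input_prefix": prefix,   # code before the cursor
        "input_suffix": suffix,   # code after the cursor
        "n_predict": n_predict,   # maximum number of tokens to generate
    }

payload = build_infill_payload("def add(a, b):\n    return ", "\n\nprint(add(2, 3))")
print(json.dumps(payload))
```

Posting this JSON to a running server (for example with curl against http://127.0.0.1:8080/infill, assuming the default port) should return the model's suggested completion for the gap between prefix and suffix.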
Quick Start & Requirements
- Install the llama-vscode extension from the VS Code marketplace or Open VSX.
- Run a llama.cpp server instance. On macOS, llama.cpp can be installed via brew install llama.cpp; on other OSes, build it from source.
- Recommended llama-server settings vary with available VRAM; CPU-only configurations are also provided.
- Quick start requires llama.cpp setup and a model download.
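The steps above can be condensed into a minimal setup sketch. The model and flags here are illustrative examples only, not the extension's required configuration:

```shell
# Install llama.cpp (macOS; on other OSes, build from source)
brew install llama.cpp

# Start llama-server with a FIM-capable model pulled from Hugging Face.
# The model and port are example choices -- pick a model that fits your
# VRAM, and consult the llama.cpp docs for CPU-only settings.
llama-server -hf ggml-org/Qwen2.5-Coder-1.5B-Q8_0-GGUF --port 8012
```

The extension then talks to this server; make sure the endpoint configured in the extension settings matches the host and port you started llama-server on.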
Maintenance & Community
The project is associated with the ggml-org organization, known for llama.cpp. Further community and roadmap details are not explicitly provided in the README.
Licensing & Compatibility
The README does not explicitly state a license for the VS Code extension itself. The underlying llama.cpp project is licensed under the permissive MIT license, but users should verify the specific license for both components.
Limitations & Caveats
The extension's functionality is entirely dependent on a correctly configured and running llama.cpp server. Performance and completion quality will vary significantly based on the chosen LLM and hardware capabilities, especially for CPU-only configurations.