Open-source VS Code extension for local code completion
Top 38.6% on sourcepulse
Privy is an open-source VS Code extension that provides local, privacy-focused AI code completion and chat, positioning itself as an alternative to cloud-based services like GitHub Copilot. Developers run large language models (LLMs) directly on their own machines, so source code and prompts never have to leave the local environment.
How It Works
Privy integrates with local LLM inference engines such as Ollama, llamafile, and llama.cpp. Users select and configure specific LLMs for code completion and chat, with recommendations for models such as DeepSeek Coder and CodeLlama. The extension provides real-time code suggestions and a Copilot-style chat interface for explaining code, debugging, and generating unit tests.
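Concretely, this means the extension talks to a local HTTP endpoint. As a rough TypeScript sketch (for illustration only, not Privy's actual source), a single completion request against Ollama's documented /api/generate route might look like this; the completeCode function, prompt handling, and model choice are assumptions:

    // Sketch: request a completion from a local Ollama server.
    // Assumes Ollama is running on its default port with the model pulled.
    async function completeCode(prefix: string): Promise<string> {
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "deepseek-coder:1.3b-base", // a model this summary mentions
          prompt: prefix,                    // code before the cursor
          stream: false,                     // one JSON reply instead of a stream
        }),
      });
      const data = await res.json();
      return data.response; // Ollama places the generated text in `response`
    }

Keeping inference behind a localhost endpoint like this is what lets the editor integration stay provider-agnostic: swapping Ollama for llamafile or llama.cpp is a matter of pointing at a different URL.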
Quick Start & Requirements
- Install a local inference engine such as Ollama and pull a model (e.g., deepseek-coder:1.3b-base for low VRAM usage).
- Set privy.provider (e.g., ollama) and privy.providerUrl (default http://localhost:11434) in VS Code settings.
- Configure privy.autocomplete.model and privy.model with chosen LLM names, as in the sample settings below.
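Taken together, the relevant entries in settings.json would look roughly like the following. The keys come from the steps above; the value for privy.model is only a placeholder assumption, so substitute whichever chat model you pulled:

    {
      "privy.provider": "ollama",
      "privy.providerUrl": "http://localhost:11434",
      "privy.autocomplete.model": "deepseek-coder:1.3b-base",
      "privy.model": "deepseek-coder:1.3b-base"
    }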
Maintenance & Community
The project acknowledges contributions from multiple individuals and is inspired by RubberDuck AI. A contributing guide and "good first issues" are available for community involvement.
Licensing & Compatibility
The README does not explicitly state a license, so further investigation may be needed before commercial use or closed-source linking.
Limitations & Caveats
The README notes that AI responses may be inaccurate, especially for less common topics or detailed conversations. It advises users not to blindly trust answers and to use separate chat threads for different topics to improve accuracy.