Local chatbot for coding assistance
This project provides a locally-run, privacy-focused AI coding assistant powered by the DeepSeek-r1 model via Ollama and built with Gradio. It targets developers seeking on-device assistance for Python coding, debugging, documentation, and solution design without relying on external cloud services.
How It Works
The assistant leverages LangChain to orchestrate interactions with the DeepSeek-r1 language model, served locally by Ollama. This architecture allows for 100% local execution, ensuring data privacy and offline usability. The Gradio interface provides a user-friendly chat experience for querying the model.
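Under the hood, every query ultimately becomes an HTTP request to the local Ollama server. The actual app routes this through LangChain, but the request shape can be sketched with the standard library alone (a minimal sketch, assuming Ollama's default `/api/generate` endpoint on port 11434; `build_request` is an illustrative helper, not part of the project):

```python
import json

# Default local Ollama endpoint (no cloud service involved).
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"

def build_request(prompt: str, model: str = "deepseek-r1:1.5b") -> bytes:
    """Build the JSON body that Ollama's /api/generate endpoint expects."""
    payload = {
        "model": model,     # the tag pulled via `ollama pull`
        "prompt": prompt,
        "stream": False,    # return one complete response instead of chunks
    }
    return json.dumps(payload).encode("utf-8")

body = build_request("Write a Python function that reverses a string.")
# POSTing `body` to OLLAMA_URL returns the model's completion;
# this requires `ollama serve` to be running locally.
```

Because the endpoint is local, no prompt text ever leaves the machine.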
Quick Start & Requirements
1. Install the Python dependencies: pip install -r requirements.txt
2. Pull the model: ollama pull deepseek-r1:1.5b
3. Start the Ollama server: ollama serve
4. Launch the app: python app.py
5. Open http://127.0.0.1:7860 in a browser.
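A common failure mode is launching the app before `ollama serve` is up. A quick stdlib-only check (a sketch assuming Ollama's default port 11434; `ollama_running` is an illustrative helper, not part of the project):

```python
import socket

def ollama_running(host: str = "127.0.0.1", port: int = 11434,
                   timeout: float = 1.0) -> bool:
    """Return True if something is listening on the Ollama port."""
    try:
        # A successful TCP connect means a server is accepting connections.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Running this before `python app.py` avoids confusing connection errors in the Gradio UI.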
Maintenance & Community
Contributions, issues, and feature requests are welcome via the issues page.
Licensing & Compatibility
MIT License. Permissive for commercial use and closed-source linking.
Limitations & Caveats
The project depends on the availability and performance of the DeepSeek-r1 model and a running Ollama server. Model size involves a trade-off: the 1.5b tag responds faster on modest hardware, while larger variants handle complex queries better at the cost of speed and memory.
Last updated 6 months ago; the repository is currently inactive.