are-copilots-local-yet by ErikBjare

Curated list of local LLM copilots for code completion and more

created 2 years ago
572 stars

Top 57.2% on sourcepulse

Project Summary

This repository serves as a curated, community-driven directory of open-source, locally runnable Large Language Models (LLMs) and associated tools for code completion, project generation, and shell assistance. It aims to provide developers with a comprehensive overview of the rapidly evolving landscape of local AI coding assistants, enabling offline, private, and potentially more context-aware development workflows compared to cloud-based solutions.

How It Works

The project categorizes various components of the local AI coding ecosystem, including editor extensions (e.g., VSCode plugins), standalone tools for project generation, chat interfaces with shell/REPL capabilities, and specific LLM models optimized for coding tasks. It highlights projects that enable local execution, offering benefits like offline access, improved responsiveness, and the potential for specialized model usage. The curation emphasizes projects that are actively maintained and demonstrate promising capabilities in replicating or exceeding the functionality of commercial offerings like GitHub Copilot.

Quick Start & Requirements

A suggested setup installs LM Studio and the Continue.dev VSCode extension. Users download LLM models through LM Studio, start its local server, and point the Continue.dev extension at those models. Specific recommendations include Qwen 2.5 Coder for autocomplete and DeepSeek R1 for chat, with the quantization level chosen to fit the available hardware (e.g., an M2 MacBook Pro with 32 GB RAM).
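Wiring the two tools together can be sketched in Continue.dev's `config.json` (a minimal example, not taken from the repo; the exact model identifiers are assumptions and depend on which models you download in LM Studio, whose server defaults to `http://localhost:1234`):

```json
{
  "models": [
    {
      "title": "DeepSeek R1 (chat)",
      "provider": "lmstudio",
      "model": "deepseek-r1-distill-qwen-14b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Qwen 2.5 Coder (autocomplete)",
    "provider": "lmstudio",
    "model": "qwen2.5-coder-7b"
  }
}
```

With the LM Studio server running, reloading VSCode should let Continue.dev list the configured models in its chat and autocomplete settings.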

Highlighted Details

  • Comprehensive listing of editor extensions, project generation tools, chat interfaces, and relevant LLM models.
  • Detailed comparison tables with stars, release dates, and notes for each category.
  • Includes a "Suggested Setup" section for a quick start using LM Studio and Continue.dev.
  • Highlights the advantages of local copilots: offline use, privacy, improved responsiveness, and specialized model capabilities.

Maintenance & Community

The repository is community-driven, with an open invitation for users to contribute and keep the list current. It links to the tweet announcing the repo and to a star-history graph.

Licensing & Compatibility

The README does not explicitly state a license for the curated list itself. Individual projects mentioned within the list will have their own licenses, which may vary and could impact commercial use or closed-source integration.

Limitations & Caveats

The quality of local copilot output is noted as not yet on par with cloud-based services. The rapid pace of LLM releases means the "Models" section is prone to becoming outdated quickly. Some listed projects may be stale or proof-of-concept.

Health Check

  • Last commit: 6 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 23 stars in the last 90 days

