dir-assistant by curvedinf

CLI tool to chat with local files using local/API LLMs

created 1 year ago
387 stars

Top 75.2% on sourcepulse

Project Summary

This CLI tool allows users to interact with their project's codebase by leveraging Large Language Models (LLMs). It indexes text files within a directory and includes relevant content in prompts sent to local or API-based LLMs, serving as a coding aid and automation tool.

How It Works

The tool employs a novel Contextually Guided Retrieval-Augmented Generation (CGRAG) method to identify and prioritize the most relevant files for inclusion in LLM prompts: a first LLM pass produces a guidance list of the concepts and information the prompt requires, and that guidance steers a second retrieval pass that assembles the final context. This approach aims to improve the LLM's understanding and response accuracy by dynamically selecting contextually pertinent information. The tool supports a wide range of local LLM backends (CPU, CUDA, ROCm, Metal, Vulkan, SYCL) and integrates with various LLM APIs via LiteLLM.
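
A sketch of how the two-pass behavior is toggled, assuming the config file location and the USE_CGRAG/PRINT_CGRAG key names from the README's sample config; confirm both against the current docs.

    dir-assistant config open    # opens the TOML config (typically ~/.config/dir-assistant/config.toml)
    # In the [DIR_ASSISTANT] section of the config:
    #   USE_CGRAG = true     # first pass asks the LLM which concepts/files it needs
    #   PRINT_CGRAG = false  # set true to print the intermediate guidance pass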

Quick Start & Requirements

  • Installation: pip install dir-assistant or pipx install dir-assistant (for Ubuntu 24.04). For local LLM support: pip install dir-assistant[recommended].
  • Prerequisites: Python. Local LLMs use llama-cpp-python, which is compiled during installation and therefore needs a C compiler; this is a notable hurdle on Windows. Hardware acceleration (CUDA, ROCm, etc.) requires the matching drivers and build configuration. API-based LLMs require an API key.
  • Setup: For the API path, install the package, set an API key, and run the tool inside your project directory. Local LLM setup additionally requires downloading models and, optionally, configuring hardware acceleration (see the sketch after this list).
  • Docs: https://github.com/curvedinf/dir-assistant
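
A minimal quick-start sketch based on the README's examples. The setkey and models subcommands are taken from the project's documentation but may change between releases; verify against the docs linked above.

    # API-based LLM (default path)
    pip install dir-assistant
    dir-assistant setkey GEMINI_API_KEY xxYOURAPIKEYHERExx   # store an API key
    cd /path/to/your/project
    dir-assistant                                            # start an interactive chat

    # Local LLM path
    pip install 'dir-assistant[recommended]'   # quotes stop the shell from globbing the brackets
    dir-assistant models download-embed        # fetch the default embedding model
    dir-assistant models download-llm          # fetch a default local LLM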

Highlighted Details

  • Supports interactive chat and non-interactive single-prompt modes (see the sketch after this list).
  • Automated file updates and Git commits can be enabled.
  • Extensive configuration options via TOML files or environment variables.
  • Includes a script example for analyzing Reddit stock sentiment.
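
A hedged sketch of the non-interactive mode and the Git automation toggle; the -s flag and the COMMIT_TO_GIT key follow the README's examples and should be verified against the current docs.

    # Single-prompt (non-interactive) mode: prints one answer and exits
    dir-assistant -s "Briefly describe what this codebase does"

    # Automated file updates and commits are enabled in the TOML config
    # ([DIR_ASSISTANT] section; key name per the README's sample):
    #   COMMIT_TO_GIT = true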

Maintenance & Community

  • Project sponsored by Blazed.deals.
  • Contributors are acknowledged in CONTRIBUTORS.md.
  • Issues can be reported on GitHub.

Licensing & Compatibility

  • The specific license is not explicitly stated in the README, but the project relies on llama-cpp-python and LiteLLM, which have their own licenses. Compatibility for commercial use would require verifying the licenses of all dependencies.

Limitations & Caveats

  • Currently only detects and reads text files.
  • Direct local LLM use on Windows is not recommended due to llama-cpp-python C compiler requirements; using a separate LLM server like LMStudio is advised.
  • llama-cpp-python may not always support the latest Llama.cpp features or models.

Health Check

  • Last commit: 1 month ago
  • Responsiveness: 1 day
  • Pull Requests (30d): 0
  • Issues (30d): 1
  • Star history: 40 stars in the last 90 days
