CLI tool to chat with local files using local/API LLMs
This CLI tool allows users to interact with their project's codebase by leveraging Large Language Models (LLMs). It indexes text files within a directory and includes relevant content in prompts sent to local or API-based LLMs, serving as a coding aid and automation tool.
How It Works
The tool employs a novel Contextually Guided Retrieval-Augmented Generation (CGRAG) method to identify and prioritize the most relevant files for inclusion in LLM prompts. This approach aims to enhance the LLM's understanding and response accuracy by dynamically selecting contextually pertinent information. It supports a wide range of local LLM backends (CPU, CUDA, ROCm, Metal, Vulkan, SYCL) and integrates with various LLM APIs via LiteLLM.
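The project's CGRAG implementation is not reproduced in this summary, but the general shape of a two-stage, guided retrieval pass can be sketched. In the hypothetical example below, embed() is a bag-of-words stand-in for a real embedding model and ask_llm_for_concepts() stubs the LLM guidance call; none of these names come from dir-assistant itself.

```python
# Minimal sketch of CGRAG-style two-stage retrieval (all names hypothetical).
import math
import re
from collections import Counter
from pathlib import Path

def embed(text: str) -> Counter:
    # Stand-in for a neural embedding model: bag-of-words term counts.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_files(query: str, files: dict[str, str], k: int = 3) -> list[str]:
    # Rank files by similarity to the query and keep the best k.
    q = embed(query)
    return sorted(files, key=lambda f: cosine(q, embed(files[f])), reverse=True)[:k]

def ask_llm_for_concepts(query: str, snippets: str) -> str:
    # Guidance step: a real system would prompt the LLM to name the concepts
    # and files needed to answer; stubbed here so the sketch runs standalone.
    return query + " configuration parsing entry point"

def cgrag_select(query: str, files: dict[str, str]) -> list[str]:
    first_pass = top_files(query, files)  # stage 1: coarse retrieval
    guidance = ask_llm_for_concepts(query, " ".join(files[f] for f in first_pass))
    return top_files(guidance, files)     # stage 2: guided re-retrieval

if __name__ == "__main__":
    corpus = {str(p): p.read_text(errors="ignore") for p in Path(".").rglob("*.py")}
    if corpus:
        print(cgrag_select("where is the configuration parsed?", corpus))
```

The two passes matter because a user's raw question often omits the vocabulary the relevant files actually use; letting the model expand the query first tends to surface those files.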
Quick Start & Requirements
Install with pip install dir-assistant, or pipx install dir-assistant on Ubuntu 24.04. For local LLM support, use pip install dir-assistant[recommended]. Local LLM inference depends on llama-cpp-python, which requires additional setup on Windows. Hardware acceleration (CUDA, ROCm, etc.) requires specific drivers and setup. API keys are needed for API-based LLMs.
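dir-assistant routes API-based model calls through LiteLLM, so a provider API key exported in the environment is the main prerequisite. For reference, a standalone LiteLLM call looks like the following; the model string and key are placeholders, not dir-assistant configuration.

```python
# Minimal LiteLLM completion call, independent of dir-assistant.
# LiteLLM reads provider keys from the environment (e.g. OPENAI_API_KEY).
import os
from litellm import completion

os.environ.setdefault("OPENAI_API_KEY", "sk-...")  # placeholder key

response = completion(
    model="gpt-4o-mini",  # any LiteLLM-supported model string
    messages=[{"role": "user", "content": "Summarize this repository."}],
)
print(response.choices[0].message.content)
```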
Highlighted Details
Maintenance & Community
Contributor information is maintained in CONTRIBUTORS.md.
Licensing & Compatibility
The tool depends on llama-cpp-python and LiteLLM, which have their own licenses. Compatibility for commercial use would require verifying the licenses of all dependencies.
Limitations & Caveats
Installing llama-cpp-python is complicated by C compiler requirements, particularly on Windows; using a separate LLM server like LMStudio is advised. llama-cpp-python may not always support the latest Llama.cpp features or models.
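When a separate server such as LMStudio hosts the model, it typically exposes an OpenAI-compatible endpoint that LiteLLM can target directly. The sketch below assumes LMStudio's commonly used default address (http://localhost:1234/v1); the model name is a placeholder.

```python
# Sketch: sending a request to a local OpenAI-compatible server rather than
# a cloud API. The api_base and model name are assumptions; adjust to match
# your server's settings.
from litellm import completion

response = completion(
    model="openai/local-model",           # "openai/" prefix selects the OpenAI-compatible route
    api_base="http://localhost:1234/v1",  # LMStudio's default local endpoint (assumed)
    api_key="not-needed",                 # local servers generally ignore the key
    messages=[{"role": "user", "content": "Hello from a local model."}],
)
print(response.choices[0].message.content)
```

Offloading inference to a standing server this way sidesteps the llama-cpp-python build entirely, which is why it is the suggested route on Windows.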