GraphRAG tool for local LLMs, featuring indexing, prompt tuning, and querying UIs
Top 20.9% on sourcepulse
This project provides a local, UI-driven ecosystem for GraphRAG, enabling users to build and query knowledge graphs using local LLMs. It targets developers and researchers seeking a cost-effective, self-hosted solution for advanced RAG applications, offering a comprehensive suite of tools for indexing, prompt tuning, querying, and visualization.
How It Works
The system is built around a FastAPI-based core API that orchestrates all GraphRAG operations. It supports local LLMs and embedding models via Ollama or OpenAI-compatible APIs, eliminating cloud dependencies. A separate Gradio UI handles indexing and prompt tuning, while a legacy UI offers visualization and querying, with plans for a dedicated querying/chat UI. This modular, API-centric architecture promotes flexibility and maintainability.
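Because the system speaks to local models through OpenAI-compatible APIs, any standard chat-completion payload works against an Ollama backend. The sketch below builds such a payload; the base URL, model name, and prompt are illustrative assumptions, not values taken from this project's configuration.

```python
import json

# Ollama exposes an OpenAI-compatible API at this default address
# (assumed default, not project config).
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt: str, model: str = "llama3") -> dict:
    """Build an OpenAI-style chat-completion payload for a local model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,
    }

payload = build_chat_request("Summarize the key entities in this community.")

# To actually send it (requires a running Ollama server):
#   import urllib.request
#   req = urllib.request.Request(
#       f"{OLLAMA_BASE_URL}/chat/completions",
#       data=json.dumps(payload).encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   print(urllib.request.urlopen(req).read().decode())
print(json.dumps(payload, indent=2))
```

Swapping the base URL for an OpenAI (or other compatible) endpoint is the only change needed to move between local and hosted models, which is the flexibility the API-centric design is after.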
Quick Start & Requirements
Install the bundled GraphRAG package and its dependencies, start the core API, optionally the embedding proxy, then launch the UIs:

    pip install -e ./graphrag
    pip install -r requirements.txt
    python api.py                # core API
    python embedding_proxy.py    # optional embedding proxy
    gradio index_app.py          # indexing / prompt-tuning UI
    gradio app.py                # legacy visualization & querying UI

Highlighted Details
Maintenance & Community
The project is actively developed, though updates have been slowed by the author's day job. Contributions and PRs are encouraged. Users can report issues and request features via GitHub Issues.
Licensing & Compatibility
The project's license is not explicitly stated in the README. Compatibility for commercial use or closed-source linking is not specified.
Limitations & Caveats
The project is undergoing a major transition with separate UIs being developed, potentially leading to instability. It has been primarily tested on macOS (M2 Studio), and Windows users may encounter encoding issues. Support for additional file formats and advanced graph analysis tools is planned but not yet implemented.
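The Windows encoding issues mentioned above are typically the general Python pitfall of the interpreter defaulting to a legacy locale codec (e.g. cp1252) when opening files, so non-ASCII characters in indexed documents can raise UnicodeDecodeError. A common general remedy, not a fix documented by this project, is to pass an explicit encoding (or set the PYTHONUTF8=1 environment variable):

```python
from pathlib import Path

# Hypothetical demo file; the point is the explicit encoding= argument,
# which avoids falling back to the platform's locale codec on Windows.
path = Path("sample_doc.txt")
path.write_text("Café naïve résumé", encoding="utf-8")  # always write UTF-8

text = path.read_text(encoding="utf-8")  # and read it back the same way
print(text)
path.unlink()  # clean up the demo file
```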