Autogen_GraphRAG_Ollama by karthik-codex

Multi-agent RAG "superbot" using local LLMs

Created 1 year ago
745 stars

Top 46.6% on SourcePulse

Project Summary

This project provides a fully local, free, multi-agent Retrieval-Augmented Generation (RAG) system by integrating Microsoft's GraphRAG with AutoGen, Ollama for local LLMs, and Chainlit for a conversational UI. It targets developers and researchers seeking to build sophisticated RAG applications without relying on external APIs or cloud services, enabling offline and private AI interactions.

How It Works

The system leverages GraphRAG's knowledge graph capabilities for efficient information retrieval, enhanced by AutoGen's multi-agent framework for complex task execution. It achieves offline LLM support by configuring GraphRAG to use local models from Ollama for both inference and embeddings. A key innovation is extending AutoGen's function calling to work with non-OpenAI LLMs via a Lite-LLM proxy, enabling seamless integration with Ollama's diverse model ecosystem. A Chainlit UI provides an interactive, continuous conversation experience.
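The wiring described above can be sketched as follows. This is a minimal illustration, not the project's actual code: the port assumes Lite-LLM's default proxy endpoint, the model name matches the proxy command given below, and query_graphrag is a hypothetical tool function shelling out to the GraphRAG CLI.

```python
import subprocess

# OpenAI-style config list pointing AutoGen at the Lite-LLM proxy instead of
# the OpenAI API (4000 is Lite-LLM's default port; adjust if yours differs).
config_list = [
    {
        "model": "ollama_chat/llama3",        # model name served by the proxy
        "base_url": "http://localhost:4000",  # Lite-LLM proxy endpoint
        "api_key": "not-needed",              # the proxy ignores the key
    }
]

def query_graphrag(question: str, method: str = "global") -> str:
    """Hypothetical tool function an AutoGen agent could call.

    Shells out to the GraphRAG query CLI; assumes the index was already
    built with `python -m graphrag.index --root .`.
    """
    result = subprocess.run(
        ["python", "-m", "graphrag.query", "--root", ".",
         "--method", method, question],
        capture_output=True, text=True,
    )
    return result.stdout

# With pyautogen installed, the function would be attached roughly like:
#   assistant = autogen.AssistantAgent("rag", llm_config={"config_list": config_list})
#   autogen.register_function(query_graphrag, caller=assistant, executor=user_proxy)
```

Routing AutoGen through the proxy is what lets its OpenAI-style function calling work against Ollama-served models.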

Quick Start & Requirements

  • Installation: Clone the repository, create a Conda environment (conda create -n RAG_agents python=3.12), activate it, and install dependencies (pip install -r requirements.txt).
  • LLMs: Requires Ollama installation and pulling models like mistral, nomic-embed-text, and llama3 (ollama pull <model_name>).
  • Setup: Initialize GraphRAG (python -m graphrag.index --init --root .), replace the GraphRAG package's utility files with the repo's patched copies, create embeddings (python -m graphrag.index --root .), start the Lite-LLM proxy (litellm --model ollama_chat/llama3), and run the UI (chainlit run appUI.py).
  • Resources: Requires Python 3.12, Conda, Git, Ollama, and sufficient disk space for models and embeddings. GPU acceleration is recommended for performance.
  • Docs: Medium.com Guide
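Pointing GraphRAG at Ollama is typically done in the settings.yaml that the init step generates. A hedged sketch (field names follow GraphRAG's early settings format and Ollama's default ports; both may differ in your versions):

```yaml
llm:
  api_key: ollama            # Ollama ignores the key, but GraphRAG requires one
  type: openai_chat
  model: mistral
  api_base: http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint

embeddings:
  llm:
    api_key: ollama
    type: openai_embedding
    model: nomic-embed-text
    api_base: http://localhost:11434/api  # embedding endpoint may differ by setup
```

Both inference and embeddings resolve to local Ollama models, which is what makes the pipeline fully offline.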

Highlighted Details

  • Integrates GraphRAG's knowledge search with AutoGen agents via function calling.
  • Supports offline LLM inference and embedding using Ollama models.
  • Enables non-OpenAI function calling for AutoGen through Lite-LLM.
  • Provides an interactive, multi-threaded conversational UI with Chainlit.

Maintenance & Community

The project is maintained by karthik-codex. Community channels or roadmaps are not explicitly mentioned in the README.

Licensing & Compatibility

The repository's license is not specified in the README. Compatibility for commercial use or closed-source linking would depend on the underlying licenses of GraphRAG, AutoGen, Ollama, and Lite-LLM.

Limitations & Caveats

The README indicates that specific files within the GraphRAG package need to be manually replaced, suggesting potential installation complexities or a need for patching. The project appears to be a custom integration rather than an officially supported distribution of these components.

Health Check

  • Last Commit: 1 year ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 13 stars in the last 30 days
