efflux-backend by isoftstone-data-intelligence-ai

Backend for LLM agent chat client

Created 6 months ago · 414 stars · Top 71.8% on sourcepulse

Project Summary

Efflux-backend provides a robust backend for LLM-powered agent chat clients, focusing on streaming responses and chat history management. It targets developers building conversational AI applications, offering a standardized way to connect with various LLM providers via the Model Context Protocol (MCP) for tool invocation and data access.

How It Works

Efflux acts as an MCP Host, leveraging the Model Context Protocol to abstract LLM interactions. It supports dynamic loading and invocation of MCP-defined tools, allowing agents to access external data and functionality. The architecture is built on FastAPI for asynchronous web serving, SQLAlchemy for database interactions, and Pydantic for data validation, enabling efficient handling of multiple LLM providers and real-time streaming responses.
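As a rough orientation for the stack described above, the following is a minimal sketch of what a streaming chat endpoint built on FastAPI and Pydantic could look like. The route path, request model, and token generator are assumptions for illustration, not code from the repository.

```python
# Illustrative sketch only: the route path, request model, and token source are
# assumptions, not taken from the efflux-backend codebase.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

app = FastAPI()


class ChatRequest(BaseModel):
    conversation_id: str
    message: str


async def generate_tokens(request: ChatRequest):
    # A real implementation would forward the message to the configured LLM
    # provider and yield tokens as they arrive; placeholders are yielded here.
    for token in ("This ", "is ", "a ", "streamed ", "reply."):
        yield token


@app.post("/chat/stream")
async def chat_stream(request: ChatRequest):
    # Stream the generated tokens back to the client chunk by chunk.
    return StreamingResponse(generate_tokens(request), media_type="text/plain")
```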

Quick Start & Requirements

  • Install: pip install uv, then uv sync --reinstall within the cloned repository.
  • Prerequisites: Python 3.12+, PostgreSQL, uv package manager.
  • Setup: Clone the repository, install dependencies with uv, configure .env with database and LLM details, initialize the database with Alembic (alembic upgrade head), and run initialization scripts (python scripts/init_llm_templates.py).
  • Run: python -m uvicorn main:app --host 0.0.0.0 --port 8000
  • Docs: Efflux Homepage

Highlighted Details

  • Supports multiple LLM providers including Qwen, Azure OpenAI, Doubao, and Moonshot.
  • Implements real-time streaming chat responses.
  • Manages chat history.
  • Utilizes the Model Context Protocol for standardized tool invocation (see the sketch after this list).
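For readers unfamiliar with MCP, the snippet below shows the generic pattern for discovering and calling tools with the official mcp Python SDK. The server command and tool name are placeholders; this is not efflux-backend's internal code.

```python
# Generic MCP client pattern using the official `mcp` Python SDK; the server
# command and tool name below are placeholders, not efflux-backend internals.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    server_params = StdioServerParameters(command="python", args=["example_mcp_server.py"])
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the MCP server exposes at runtime.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Invoke a tool by name with structured arguments.
            result = await session.call_tool("get_weather", {"city": "Berlin"})
            print(result)


asyncio.run(main())
```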

Maintenance & Community

The project acknowledges the open-source projects it builds on, including Langchain, FastAPI, SQLAlchemy, and others. The README does not detail specific community channels or active maintainer information.

Licensing & Compatibility

The project's license is not stated in the README. Commercial use or closed-source linking would require clarification of the licensing terms.

Limitations & Caveats

The README does not specify the project's license, which is a critical factor for commercial adoption. Configuration for LLM providers requires direct code modification in core/common/container.py.
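To make the second caveat concrete, the fragment below is a purely hypothetical illustration of the kind of in-code provider registration that statement implies; none of the names, fields, or values are taken from core/common/container.py.

```python
# Purely hypothetical illustration of in-code provider registration; the names,
# fields, and values below are NOT from core/common/container.py.
from dataclasses import dataclass


@dataclass
class ProviderConfig:
    name: str
    api_key_env: str   # environment variable that holds the API key
    base_url: str


# Adding or changing a provider in a setup like this means editing source code
# rather than an external configuration file.
PROVIDERS = {
    "qwen": ProviderConfig("qwen", "QWEN_API_KEY", "https://example.invalid/qwen"),
    "azure_openai": ProviderConfig("azure_openai", "AZURE_OPENAI_API_KEY", "https://example.invalid/azure"),
}
```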

Health Check

  • Last commit: 1 month ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 6 stars in the last 90 days
