rs-graph-llm by a-agmon

High-performance Rust SDK for interactive AI agent workflows

Created 9 months ago
262 stars

Top 97.1% on SourcePulse

Project Summary

A high-performance, type-safe framework for building interactive multi-agent workflow systems in Rust. It combines a core graph execution engine (graph-flow) with Rust-native LLM integration (Rig) to enable complex, stateful AI agent orchestration in production environments, with an emphasis on performance, flexibility, and robust state management.

How It Works

The framework is built around two core Rust crates: graph-flow for orchestrating stateful workflows via a graph execution engine, and Rig for seamless LLM agent capabilities. It adopts LangGraph's philosophy of combining graph execution with LLM integration but is implemented from scratch in Rust for enhanced performance and type safety. Workflows are defined as directed graphs where nodes are tasks, and edges represent transitions. The system supports flexible execution models, including step-by-step and continuous execution, with built-in features for session management, context sharing, conditional routing, and human-in-the-loop interactions.
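The graph model above (tasks as nodes, transitions as edges, conditional routing on runtime data) can be sketched in plain Rust. This is an illustrative sketch only: the names `Task`, `Context`, `NextAction`, and the `run` loop mirror the concepts described here, not graph-flow's actual API.

```rust
use std::collections::HashMap;

// Shared state passed between tasks (hypothetical; the real crate's
// context type is richer).
type Context = HashMap<String, String>;

// What the executor should do after a task finishes.
enum NextAction {
    GoTo(&'static str), // follow the edge to the named task
    End,                // stop the workflow
}

// A node in the workflow graph: reads/writes the context, picks an edge.
type Task = fn(&mut Context) -> NextAction;

fn classify(ctx: &mut Context) -> NextAction {
    // Conditional routing based on runtime data.
    if ctx.get("input").map_or(false, |s| s.contains("claim")) {
        NextAction::GoTo("handle_claim")
    } else {
        NextAction::GoTo("fallback")
    }
}

fn handle_claim(ctx: &mut Context) -> NextAction {
    ctx.insert("result".into(), "claim routed".into());
    NextAction::End
}

fn fallback(ctx: &mut Context) -> NextAction {
    ctx.insert("result".into(), "no route".into());
    NextAction::End
}

// Step-by-step execution: run one task, follow its chosen edge, repeat.
fn run(graph: &HashMap<&str, Task>, start: &str, ctx: &mut Context) {
    let mut current = start.to_string();
    loop {
        let task = graph[current.as_str()];
        match task(ctx) {
            NextAction::GoTo(next) => current = next.to_string(),
            NextAction::End => break,
        }
    }
}

fn main() {
    let mut graph: HashMap<&str, Task> = HashMap::new();
    graph.insert("classify", classify);
    graph.insert("handle_claim", handle_claim);
    graph.insert("fallback", fallback);

    let mut ctx = Context::new();
    ctx.insert("input".into(), "new insurance claim".into());
    run(&graph, "classify", &mut ctx);
    println!("{}", ctx["result"]); // "claim routed"
}
```

In the real framework the tasks would be async and LLM-backed via Rig, and the context would be persisted per session; the control flow is the same shape.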

Quick Start & Requirements

  • Primary install: Add graph-flow to your Cargo.toml with the rig feature: graph-flow = { version = "0.2.3", features = ["rig"] }.
  • Prerequisites: Requires LLM API keys (e.g., OPENROUTER_API_KEY environment variable) and optionally a database URL (DATABASE_URL) for PostgreSQL persistence.
  • Examples: The repository includes progressive examples (examples/simple_example.rs, examples/complex_example.rs, examples/recommendation_flow.rs) and production-ready services like insurance-claims-service/ and recommendation-service/.
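A minimal setup sketch based on the items above; the environment variable names come from the summary, while the key value, database URL, and run command are placeholders/assumptions:

```sh
# Environment expected by the examples (names from the summary above).
export OPENROUTER_API_KEY="sk-or-..."                  # placeholder; required for LLM calls
export DATABASE_URL="postgres://localhost/graphflow"   # placeholder; optional PostgreSQL persistence

# Run one of the progressive examples from the repository checkout.
cargo run --example simple_example
```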

Highlighted Details

  • LLM Integration: Seamlessly integrates LLM agents using the Rig crate for natural language processing and generation within workflows.
  • Stateful Workflows: Manages complex state across multiple interactions using session management and a pluggable storage abstraction (in-memory, PostgreSQL).
  • Conditional Routing & Parallel Execution: Supports dynamic branching based on runtime data and concurrent execution of tasks via the FanOutTask.
  • Human-in-the-Loop: Built-in support for pausing workflows (NextAction::WaitForInput) to await human intervention or approval.
  • Production Services: Demonstrates real-world applications like insurance claims processing and RAG-based recommendation systems.
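The human-in-the-loop pattern above (a task returns NextAction::WaitForInput, the session pauses until a person responds) can be sketched as follows. Only the NextAction::WaitForInput name comes from the summary; the Session type and its methods are hypothetical stand-ins for illustration.

```rust
// Illustrative sketch; not graph-flow's actual session API.
#[derive(Debug, PartialEq)]
enum NextAction {
    Continue,
    WaitForInput, // pause the workflow until a human responds
    End,
}

struct Session {
    awaiting_input: bool,
    history: Vec<String>,
}

impl Session {
    fn new() -> Self {
        Session { awaiting_input: false, history: Vec::new() }
    }

    // Advance one step; returns false when the workflow should yield
    // control back to the caller (pause or finish).
    fn step(&mut self, action: NextAction) -> bool {
        match action {
            NextAction::WaitForInput => {
                self.awaiting_input = true; // persist state, wait for a human
                false
            }
            NextAction::Continue => true,
            NextAction::End => false,
        }
    }

    // Resume after the human replies (e.g., an approval).
    fn resume(&mut self, input: &str) {
        self.history.push(input.to_string());
        self.awaiting_input = false;
    }
}

fn main() {
    let mut session = Session::new();
    let keep_going = session.step(NextAction::WaitForInput);
    assert!(!keep_going && session.awaiting_input);
    session.resume("approved");
    assert!(!session.awaiting_input);
    println!("history: {:?}", session.history);
}
```

The key design point is that pausing is an ordinary return value, so the caller can persist the session (in memory or PostgreSQL) and resume it in a later request.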

Maintenance & Community

The README does not document community channels (such as Discord or Slack), notable contributors, sponsorships, or a public roadmap.

Licensing & Compatibility

  • License: MIT License.
  • Compatibility: The permissive MIT license generally allows for commercial use and integration into closed-source projects.

Limitations & Caveats

The FanOutTask for parallel execution has limitations: child tasks cannot control flow (e.g., they cannot return WaitForInput), all children share the same context and so require careful coordination, and execution defaults to linear continuation afterward. Customizing this behavior requires rewriting tasks and modifying the graph structure within the core framework.

Health Check

  • Last Commit: 3 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 21 stars in the last 30 days
