agent-protocol by langchain-ai

API protocol for LLM agent production deployment

created 8 months ago
409 stars

Top 72.3% on sourcepulse

View on GitHub
Project Summary

This project defines a framework-agnostic API protocol for serving Large Language Model (LLM) agents in production. It aims to standardize interactions around core concepts like runs (stateless or background executions), threads (multi-turn conversations with state management), and a key-value store for long-term memory. The protocol is designed for developers building and deploying LLM-powered applications, offering a consistent interface for agent execution, state persistence, and introspection.
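As a rough illustration of these concepts, the sketch below shows how a client might create a thread and then start a run on it against a server implementing the protocol. This is a minimal sketch, assuming a local server URL and an agent identifier that are hypothetical; the endpoint paths follow the /threads and /threads/{thread_id}/runs routes referenced in Highlighted Details below, and the OpenAPI spec remains the authoritative source for the real request/response schemas.

```python
# Hypothetical client sketch: create a thread, then start a run on it.
# The base URL, agent id, and payload field names are assumptions for
# illustration; consult the OpenAPI spec for the authoritative schemas.
import requests

BASE = "http://localhost:8123"  # assumed local server implementing the protocol

# Create a thread to hold multi-turn conversation state
thread = requests.post(f"{BASE}/threads", json={}).json()
thread_id = thread["thread_id"]  # field name assumed from the Thread object

# Start a run on that thread
run = requests.post(
    f"{BASE}/threads/{thread_id}/runs",
    json={
        "agent_id": "my-agent",  # hypothetical agent identifier
        "input": {"messages": [{"role": "user", "content": "Hello"}]},
    },
).json()
print(run.get("run_id"), run.get("status"))
```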

How It Works

The protocol is built around RESTful endpoints defined in an OpenAPI specification. It separates agent execution into "runs," which can be ephemeral (stateless, single-shot) or persistent (associated with a thread). Threads provide a mechanism for managing conversational history, state revisions, and concurrency control. A key-value "store" API allows agents to persist and retrieve arbitrary data, enabling long-term memory capabilities. Introspection endpoints enable querying agent capabilities and schemas.
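For example, the key-value store could back long-term memory roughly as follows. This is a minimal sketch assuming a local server and the /store/items route named in Highlighted Details; the namespace/key/value field names and the query parameters are illustrative assumptions, not the spec's exact shapes.

```python
# Hypothetical sketch of using the key-value store API for long-term memory.
# The field names and query parameters below are assumptions for illustration;
# the OpenAPI spec defines the actual item and namespace schemas.
import requests

BASE = "http://localhost:8123"  # assumed local server implementing the protocol

# Persist an item under a namespace so the agent can recall it later
requests.put(
    f"{BASE}/store/items",
    json={
        "namespace": ["users", "user-123"],  # hierarchical namespace (assumed shape)
        "key": "preferences",
        "value": {"language": "en", "tone": "concise"},
    },
)

# Retrieve the item by namespace and key (parameter encoding assumed)
item = requests.get(
    f"{BASE}/store/items",
    params={"namespace": "users.user-123", "key": "preferences"},
).json()
print(item.get("value"))
```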

Quick Start & Requirements

  • Install/Run: The README provides OpenAPI and JSON specs, along with Python server stubs (FastAPI/Pydantic V2) and a JavaScript implementation (LangGraph.js); a simplified server sketch follows this list.
  • Resources: Links to OpenAPI docs, JSON spec, Python server stubs, and LangGraph.js API are provided.
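To give a flavor of what serving the protocol from Python might look like, here is a simplified FastAPI/Pydantic V2 sketch of a single /runs/wait route. It is not the generated server stubs: the models are reduced stand-ins and the field names are assumptions for illustration.

```python
# Simplified FastAPI sketch of one protocol route (/runs/wait).
# The real Python server stubs are generated from the OpenAPI spec; the
# models below are reduced stand-ins for illustration only.
from typing import Any

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class RunCreate(BaseModel):
    agent_id: str             # which agent to execute (field name assumed)
    input: dict[str, Any]     # agent input payload


class RunOutput(BaseModel):
    run_id: str
    status: str
    output: dict[str, Any]


@app.post("/runs/wait", response_model=RunOutput)
async def create_and_wait(run: RunCreate) -> RunOutput:
    # A real implementation would dispatch to the named agent and block
    # until it finishes; here we echo the input back as a placeholder.
    return RunOutput(run_id="run-1", status="success", output=run.input)
```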

Highlighted Details

  • Defines distinct APIs for stateless runs (/runs/wait, /runs/stream) and multi-turn threads (/threads, /threads/{thread_id}/runs).
  • Includes endpoints for agent introspection (/agents/search, /agents/{agent_id}/schemas).
  • Features a key-value store API (/store/items, /store/namespaces) for persistent memory.
  • Supports various run execution paradigms: fire-and-forget, waiting, streaming, and cancellation.
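As an example of the streaming paradigm, the sketch below consumes /runs/stream as a server-sent-event stream. The event framing is an assumption (detailed stream mode specifications are still a roadmap item, per the caveats below), so treat the parsing as illustrative.

```python
# Hypothetical sketch of consuming a streaming run via /runs/stream.
# Server-sent-event framing and the payload fields are assumptions for
# illustration; detailed stream modes are not yet specified.
import json

import requests

BASE = "http://localhost:8123"  # assumed local server implementing the protocol

with requests.post(
    f"{BASE}/runs/stream",
    json={
        "agent_id": "my-agent",  # hypothetical agent identifier
        "input": {"messages": [{"role": "user", "content": "Hi"}]},
    },
    stream=True,
) as resp:
    for line in resp.iter_lines():
        # Assume "data: {...}" SSE lines; skip keep-alives and blank lines
        if line and line.startswith(b"data: "):
            event = json.loads(line[len(b"data: "):])
            print(event)
```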

Maintenance & Community

  • LangGraph Platform is a commercial implementation that extends this protocol.
  • Community implementations are welcomed.
  • Roadmap items include detailed stream mode specifications and vector search for memory.

Licensing & Compatibility

  • The README does not state a license for the protocol specification itself; check the repository's LICENSE file before depending on it. Referenced implementations such as LangGraph.js are open source.

Limitations & Caveats

  • The protocol is a specification; actual implementations may vary.
  • Concurrent runs on the same thread are currently restricted; optional support is planned for a future revision.
  • Detailed specifications for stream modes are still pending.

Health Check

  • Last commit: 2 months ago
  • Responsiveness: 1+ week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 72 stars in the last 90 days
