concierge-hq: Reliability fabric for AI agents and MCP servers
Top 64.1% on SourcePulse
Concierge addresses the complexity of exposing AI agent tools by implementing the Model Context Protocol (MCP). It provides a reliability layer for MCP servers, enabling progressive disclosure of relevant tools rather than presenting a flat list. This benefits developers through deterministic results, reliable tool invocation, reduced LLM context window usage, and ultimately lower operational costs for AI applications.
How It Works
Concierge acts as a fabric for MCP servers, dynamically altering the tools/list response based on the current workflow step. This progressive disclosure ensures agents only see pertinent tools, simplifying interaction and reducing prompt costs. Key mechanisms include stages, which group related tools, and transitions, which enforce workflow logic. Concierge also supports server-side shared state across steps and offers semantic search to collapse large toolsets into discoverable meta-tools, abstracting away API complexity.
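The mechanics above can be sketched in plain Python. This is an illustrative model only, not Concierge's actual API: the names ALL_TOOLS, STAGES, TRANSITIONS, list_tools, advance, and the keyword-based search_tools are all hypothetical stand-ins for the stage grouping, transition enforcement, and search meta-tool the project describes.

```python
# Hypothetical sketch of progressive disclosure for an MCP tools/list
# response. All names here are illustrative, not Concierge's real API.

ALL_TOOLS = {
    "create_order": "Create a new customer order",
    "add_item": "Add a line item to the open order",
    "apply_coupon": "Apply a discount coupon to the order",
    "submit_order": "Submit the order for fulfilment",
}

# Stages group related tools; agents only ever see the current stage.
STAGES = {
    "drafting": ["create_order", "add_item"],
    "checkout": ["apply_coupon", "submit_order"],
}

# Transitions enforce workflow logic: which stage may follow which.
TRANSITIONS = {"drafting": ["checkout"], "checkout": []}


def list_tools(stage):
    """Return only the tools pertinent to the current workflow stage."""
    return {name: ALL_TOOLS[name] for name in STAGES[stage]}


def advance(current, target):
    """Move to the next stage only if the transition is allowed."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current!r} -> {target!r}")
    return target


def search_tools(query):
    """Meta-tool: search the full catalogue by keyword, so large
    toolsets stay discoverable without bloating tools/list."""
    q = query.lower()
    return [name for name, desc in ALL_TOOLS.items() if q in desc.lower()]
```

In this model, an agent in the "drafting" stage sees only two tools instead of four, and search_tools("coupon") surfaces apply_coupon on demand; that narrowing of the visible tool list is what reduces context window usage.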
Quick Start & Requirements
- pip install concierge-sdk (Python 3.9+ required; uv recommended for speed).
- concierge init <project-name> generates a starter project.
- python main.py runs scaffolded projects, or wrap an existing MCP server with: from concierge import Concierge; app = Concierge(FastMCP("my-server")).

Highlighted Details
- Workflows are enforced through stages and transitions, ensuring adherence to business logic.
- Semantic search meta-tools (search_tools, call_tool) keep large toolsets discoverable by agents.
- The concierge init command generates a functional project structure, accelerating initial setup.

Maintenance & Community
- Community support is available on Discord (https://discord.gg/bfT3VkhF).

Licensing & Compatibility
Limitations & Caveats