Connect LLMs to MCP servers for custom agents
MCP-Use is a Python library for connecting any Large Language Model (LLM) to any MCP (Model Context Protocol) server, enabling the creation of custom agents with tool access. It targets developers who want to build sophisticated AI agents that can leverage external tools, such as web browsing or file operations, without relying on proprietary clients.
How It Works
MCP-Use acts as a bridge between LLMs and MCP servers. It utilizes LangChain's tool-calling capabilities to allow LLMs to select and invoke functions exposed by MCP servers. The library supports direct HTTP connections and dynamic server selection, allowing agents to manage multiple MCP server instances simultaneously and intelligently choose the appropriate server for a given task.
Quick Start & Requirements
Install the library with `pip install mcp-use`, plus a LangChain provider package for your LLM (e.g., `langchain-openai`). Create an `MCPAgent` with an LLM and an `MCPClient` configured with server details (e.g., Playwright, Airbnb, Blender MCP).
Highlighted Details
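As a rough sketch of what "configured with server details" might look like: the dictionary below follows the common `mcpServers` JSON convention used by MCP clients, but the exact keys accepted by `MCPClient`, and the Playwright launch command shown, are assumptions to verify against the mcp-use documentation.

```python
# Hypothetical server configuration for an MCPClient (assumed structure,
# based on the common "mcpServers" convention -- check the mcp-use docs).
config = {
    "mcpServers": {
        "playwright": {
            "command": "npx",                    # assumed launch command
            "args": ["@playwright/mcp@latest"],  # hypothetical package spec
        },
    }
}

# With mcp-use installed, wiring would look roughly like (untested sketch):
# from mcp_use import MCPAgent, MCPClient
# client = MCPClient.from_dict(config)
# agent = MCPAgent(llm=some_langchain_llm, client=client)
```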
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The library requires LLMs capable of tool/function calling. While it supports multiple servers, managing complex interactions across many servers may require careful configuration and, potentially, the use_server_manager feature.