mcp-llm-bridge by bartolli

LLM bridge for Model Context Protocol (MCP) servers

created 8 months ago
337 stars

Top 82.8% on sourcepulse

Project Summary

This project provides a bridge between Model Context Protocol (MCP) servers and OpenAI-compatible Large Language Models (LLMs). It enables LLMs to interact with MCP-compliant tools by translating between MCP's tool specifications and OpenAI's function-calling interface, allowing seamless integration of local or cloud-based LLMs with MCP-enabled systems.

How It Works

The bridge acts as a bidirectional translation layer. It converts MCP tool specifications into OpenAI function schemas so the LLM can see which tools are available. When the LLM invokes a function, the bridge maps the call back to an MCP tool execution, processes the result, and returns it to the LLM. This standardizes tool interaction, letting any OpenAI-compatible LLM use MCP tools.
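The forward half of that translation can be sketched in a few lines. MCP tool definitions carry a `name`, `description`, and `inputSchema` (a JSON Schema object), while OpenAI's function-calling interface expects `{"type": "function", "function": {...}}` with a `parameters` JSON Schema; the helper name below is illustrative, not the bridge's actual API:

```python
# Illustrative sketch: map one MCP tool definition to an OpenAI
# function-calling schema. Field names follow the MCP and OpenAI
# specs; the helper function itself is hypothetical.

def mcp_tool_to_openai_function(tool: dict) -> dict:
    """Convert an MCP tool spec into an OpenAI "tools" entry."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's inputSchema is already JSON Schema, which is exactly
            # what OpenAI expects under "parameters".
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

mcp_tool = {
    "name": "query_database",
    "description": "Run a read-only SQL query",
    "inputSchema": {
        "type": "object",
        "properties": {"sql": {"type": "string"}},
        "required": ["sql"],
    },
}

openai_tool = mcp_tool_to_openai_function(mcp_tool)
```

Because both sides agree on JSON Schema for parameters, the conversion is mostly a matter of relabeling fields.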

Quick Start & Requirements

  • Install uv for package management: curl -LsSf https://astral.sh/uv/install.sh | sh
  • Clone the repository: git clone https://github.com/bartolli/mcp-llm-bridge.git
  • Navigate and set up a virtual environment: cd mcp-llm-bridge && uv venv && source .venv/bin/activate
  • Install the package: uv pip install -e .
  • Create a test database: python -m mcp_llm_bridge.create_test_db
  • Requires Python 3.x.
  • Supports OpenAI API and local endpoints implementing the OpenAI API specification (e.g., Ollama, LM Studio).
  • Official documentation: see the Resources section of the README.

Highlighted Details

  • Bidirectional protocol translation between MCP and OpenAI function-calling.
  • Supports OpenAI API (e.g., gpt-4o) and local endpoints like Ollama.
  • Converts MCP tool specs to OpenAI function schemas and vice-versa.
  • Example usage with SQLite database and uvx command.
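The "vice-versa" direction of the third bullet can be sketched the same way: when the model returns a tool call, the bridge parses the JSON-encoded arguments and dispatches an MCP tool execution. The tool-call shape below follows OpenAI's chat completions format; the helper name is illustrative, not the bridge's actual API:

```python
import json

def openai_call_to_mcp_request(tool_call: dict) -> tuple[str, dict]:
    """Turn an OpenAI tool call into (tool_name, arguments) for an
    MCP call_tool request.

    OpenAI returns arguments as a JSON string; MCP expects a dict.
    """
    fn = tool_call["function"]
    return fn["name"], json.loads(fn["arguments"] or "{}")

# Shape of a tool call as it appears in an OpenAI chat completion response
tool_call = {
    "id": "call_123",
    "type": "function",
    "function": {
        "name": "query_database",
        "arguments": "{\"sql\": \"SELECT * FROM products\"}",
    },
}

name, args = openai_call_to_mcp_request(tool_call)
# name == "query_database", args == {"sql": "SELECT * FROM products"}
```

The tool result would then be serialized back into a "tool" role message so the LLM can continue the conversation.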

Maintenance & Community

  • Open to PRs.
  • No specific community channels or roadmap links provided in the README.

Licensing & Compatibility

  • MIT License.
  • Permissive license suitable for commercial use and integration with closed-source applications.

Limitations & Caveats

The README notes that mistral-nemo:12b-instruct-2407-q8_0 handles complex queries more effectively than other tested models such as llama3.2:3b-instruct-fp16. LM Studio compatibility is described as likely to work but has not been explicitly tested.

Health Check

  • Last commit: 4 months ago
  • Responsiveness: 1+ week
  • Pull Requests (30d): 1
  • Issues (30d): 0
  • Star History: 38 stars in the last 90 days
