CLI tool for interacting with Model Context Protocol servers
This CLI client provides a robust interface for interacting with Model Context Protocol (MCP) servers, enabling seamless communication with Large Language Models (LLMs). It's designed for developers and power users who need to integrate LLM capabilities into their workflows, offering features like tool usage, conversation management, and multiple operational modes for chat, scripting, and direct command execution.
How It Works
The `mcp-cli` is built upon the `chuk-mcp` protocol library, which handles the core communication layer. This separation allows the CLI to focus on user experience, offering features like command completion, colorful output, and progress indicators. It supports multiple LLM providers (OpenAI, Ollama) and features a flexible tool system for automatic discovery and execution of server-provided tools, including complex multi-step tool chains.
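The multi-step tool chains described above can be pictured as a loop that executes each server-provided tool in order and threads each result into the next call. The sketch below is purely illustrative (the function names and the `$prev` placeholder are assumptions, not mcp-cli's actual API):

```python
# Illustrative sketch of a multi-step tool chain, NOT mcp-cli's real API.

def discover_tools():
    """Stand-in for tools advertised by an MCP server."""
    return {
        "add": lambda a, b: a + b,
        "square": lambda x: x * x,
    }

def run_tool_chain(steps, tools):
    """Execute (tool_name, args) steps in order, feeding each result forward."""
    result = None
    for name, args in steps:
        # Substitute the previous step's result wherever "$prev" appears.
        args = [result if a == "$prev" else a for a in args]
        result = tools[name](*args)
    return result

tools = discover_tools()
# (2 + 3) squared -> 25
print(run_tool_chain([("add", [2, 3]), ("square", ["$prev"])], tools))
```

In the real CLI, the tool registry is populated from the connected MCP servers and the chain is driven by the LLM's tool-call responses rather than a hard-coded step list.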
Quick Start & Requirements
Install from source with `pip install -e ".[cli,dev]"`, or install via UV. Requirements: an OpenAI API key (`OPENAI_API_KEY`) when using the OpenAI provider, a local Ollama installation (if using Ollama), and a `server_config.json` file defining the MCP servers to connect to.
Highlighted Details
Maintenance & Community
The project credits Anthropic Claude and the Rich, Typer, Prompt Toolkit, and CHUK-MCP libraries. Contribution guidelines are provided.
Licensing & Compatibility
Licensed under the MIT License, permitting commercial use and integration with closed-source projects.
Limitations & Caveats
The README mentions a reserved section for future WebAssembly support, implying it is not currently available. The project relies on external LLM providers and their respective APIs/local installations.