Cross-platform AI chat client implementing the Model Context Protocol (MCP)
Top 23.5% on sourcepulse
ChatMCP is a cross-platform AI chat client designed to interact with Large Language Models (LLMs) via the Model Context Protocol (MCP). It aims to provide a unified interface for users to chat with various LLMs, manage chat history, and utilize features like SSE transport and theme switching.
How It Works
ChatMCP implements the Model Context Protocol (MCP) to abstract LLM interactions. This allows it to connect to different LLM providers (OpenAI, Claude, Ollama, DeepSeek) through a common interface. The client supports Server-Sent Events (SSE) for real-time responses and automatically selects available MCP servers.
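The provider abstraction can be pictured as a single streaming interface that every backend implements. The sketch below is illustrative only, in Dart (the app's language); the interface and class names are assumptions, not ChatMCP's actual code:

```dart
// A minimal sketch, not ChatMCP's actual code: each LLM backend implements
// the same streaming contract, so the chat UI never depends on a specific
// provider.
abstract class ChatProvider {
  /// Streams response text chunks as they arrive (e.g. parsed from SSE).
  Stream<String> sendMessage(String prompt);
}

class OpenAiProvider implements ChatProvider {
  @override
  Stream<String> sendMessage(String prompt) async* {
    // A real implementation would open an SSE connection to the provider
    // and yield each `data:` event's text delta as it is received.
    yield 'partial response chunk';
  }
}

void main() async {
  final ChatProvider provider = OpenAiProvider();
  await for (final chunk in provider.sendMessage('Hello')) {
    print(chunk); // The UI would append each chunk to the chat view.
  }
}
```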
Quick Start & Requirements
On Linux, the libsqlite3-0 and libsqlite3-dev packages must be installed. MCP servers are launched via uvx or npx (npx requires an installation of Node). Server definitions live in mcp_server.json; a sample configuration is sketched below. Build and launch the app with flutter pub get and flutter run.
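A typical Linux setup might look like the following (the apt package manager is an assumption; adjust for your distribution):

```shell
sudo apt install libsqlite3-0 libsqlite3-dev
flutter pub get
flutter run
```

An illustrative mcp_server.json, following the common MCP client convention of a mcpServers map with a command and args per server (the server entries here are examples, not defaults shipped with the project):

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    }
  }
}
```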
Highlighted Details
Maintenance & Community
The project welcomes contributions for new features and bug fixes via its Issues tracker.
Licensing & Compatibility
Limitations & Caveats
The project is actively under development, with several features marked as planned or incomplete, such as RAG and an MCP Server Market. The Linux installation requires manually installing specific development libraries (libsqlite3-0 and libsqlite3-dev).