A TUI client for local LLMs to interact with MCP servers
This project provides a text-based user interface (TUI) client for interacting with Model Context Protocol (MCP) servers via Ollama, enabling local LLMs to call external tools. It targets developers working with local LLMs who want a streamlined, code-free way to manage tool-use workflows directly from the terminal.
How It Works
The client acts as a bridge between Ollama's local LLM instances and external MCP servers. It presents a terminal interface built with Rich and Prompt Toolkit, and supports multiple transport protocols (STDIO, SSE, and Streamable HTTP) for server connections. Key functionality includes dynamic model switching, real-time streaming responses, and full configuration of LLM parameters and system prompts.
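To make the bridging idea concrete, here is a minimal, illustrative sketch (not the project's actual source) of how a client can expose MCP tools to an Ollama model over STDIO. It assumes the official mcp and ollama Python packages and a running local Ollama instance; the server script my_mcp_server.py, the model tag, and the example question are placeholders.

    # Illustrative sketch only -- not the project's actual source. Assumes the
    # official `mcp` and `ollama` Python packages, a running local Ollama
    # instance, and a placeholder MCP server script (my_mcp_server.py).
    import asyncio

    import ollama
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client


    async def ask_with_tools(question: str, model: str = "qwen2.5:7b") -> str:
        # Launch an MCP server over STDIO (command and script are placeholders).
        params = StdioServerParameters(command="python", args=["my_mcp_server.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()

                # Advertise the server's tools to the model in Ollama's tool format.
                listed = await session.list_tools()
                tools = [{
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                } for t in listed.tools]

                messages = [{"role": "user", "content": question}]
                response = ollama.chat(model=model, messages=messages, tools=tools)

                # If the model requested tools, execute them on the MCP server
                # and feed the results back for a final answer.
                if response.message.tool_calls:
                    messages.append(response.message)
                    for call in response.message.tool_calls:
                        result = await session.call_tool(
                            call.function.name, dict(call.function.arguments))
                        messages.append({"role": "tool", "content": str(result.content)})
                    response = ollama.chat(model=model, messages=messages)

                return response.message.content


    if __name__ == "__main__":
        print(asyncio.run(ask_with_tools("What's the weather in Paris?")))

The actual client layers the SSE and Streamable HTTP transports, streaming output, and interactive model and tool management on top of this basic loop.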
Quick Start & Requirements
Install with pip install --upgrade ollmcp, or run it without installing via the one-step uvx ollmcp. Source installation is also supported. A running local Ollama instance is required.
Highlighted Details
Maintenance & Community
The project is maintained by jonigl. No specific community channels (such as Discord or Slack) or roadmap are detailed in the README.
Licensing & Compatibility
Licensed under the permissive MIT License, allowing for commercial use and integration with closed-source applications.
Limitations & Caveats
Requires a local Ollama instance. The "thinking mode" feature is model-specific. Auto-discovery relies on a specific configuration file path, which may not match every user's setup.
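For illustration, the sketch below generates a Claude-style MCP servers file of the kind such clients typically consume; the file location, the server entry, and whether ollmcp's auto-discovery reads this exact schema or path are assumptions to verify against the project README.

    # Hypothetical illustration: a Claude-style "mcpServers" JSON file, the
    # general shape such clients consume. The file name, server label, and
    # whether ollmcp's auto-discovery reads this exact location/schema are
    # assumptions, not documented facts.
    import json
    from pathlib import Path

    config = {
        "mcpServers": {
            "weather": {                       # arbitrary server label
                "command": "python",           # how to launch the MCP server
                "args": ["weather_server.py"]  # placeholder server script
            }
        }
    }

    path = Path("mcp-servers.json")            # placeholder location
    path.write_text(json.dumps(config, indent=2))
    print(path.read_text())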