Desktop app for LLM function calling
Top 28.7% on sourcepulse
Dive is an open-source desktop application that acts as a Model Context Protocol (MCP) host, enabling seamless integration with any Large Language Model (LLM) that supports function calling. It targets developers and power users looking to build sophisticated AI agents that can interact with external tools and services, offering universal LLM compatibility and cross-platform support.
How It Works
Dive uses the Model Context Protocol (MCP) to mediate communication between LLMs and external tools. As the MCP host, it manages the execution of functions called by the LLM and returns the results to the model. Both standard input/output (stdio) and Server-Sent Events (SSE) transports are supported, giving flexibility in how MCP servers connect to the application.
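As a rough illustration of the host side of this flow, the sketch below uses the official MCP TypeScript SDK to launch a server over stdio, list its tools, and invoke one. This is not Dive's actual code; the server package, tool name, and directory path are assumptions chosen for the example.

```typescript
// Minimal MCP host-side sketch (illustration only, not Dive's source).
// Assumes the official SDK is installed: npm install @modelcontextprotocol/sdk
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch an MCP server as a child process and speak MCP over stdio.
  // The filesystem server used here is an assumed example.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
  });

  const client = new Client(
    { name: "example-host", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // 1) Discover the server's tools; a host like Dive surfaces these
  //    to the LLM as function-calling definitions.
  const { tools } = await client.listTools();
  console.log("tools:", tools.map((t) => t.name));

  // 2) When the LLM emits a function call, the host executes it against
  //    the MCP server and feeds the result back to the model.
  const result = await client.callTool({
    name: "list_directory", // assumed tool name
    arguments: { path: "/tmp" },
  });
  console.log("result:", result);

  await client.close();
}

main().catch(console.error);
```

An SSE-based server looks the same from the host's perspective; only the transport construction changes.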
Quick Start & Requirements
Running MCP tool servers requires the corresponding package runners (e.g., npx, uvx). yt-dlp-mcp requires yt-dlp to be installed separately (e.g., pip install yt-dlp). Servers such as fetch, filesystem, and youtubedl can be added via JSON, as sketched below.
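The following is a sketch of what such a JSON entry could look like, assuming the widely used mcpServers configuration shape; the exact file location, schema, and package names below are illustrative assumptions rather than Dive's documented defaults.

```json
{
  "mcpServers": {
    "fetch": {
      "command": "uvx",
      "args": ["mcp-server-fetch"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
    },
    "youtubedl": {
      "command": "npx",
      "args": ["-y", "@kevinwatt/yt-dlp-mcp"]
    }
  }
}
```

Servers launched with uvx need uv (Python) available on the system, while npx-based servers need Node.js.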
Highlighted Details
Maintenance & Community
Licensing & Compatibility
Limitations & Caveats