Local MCP server for interactive LLM communication
Top 91.7% on SourcePulse
This project provides a local, cross-platform Model Context Protocol (MCP) server for Node.js/TypeScript, enabling direct user interaction with AI agents. It's designed for developers and power users who need their AI assistants to request input, display notifications, or initiate command-line chats on their local machine, enhancing workflows like interactive configuration or pair programming.
How It Works
The server exposes several MCP tools: request_user_input for prompts with optional predefined choices, message_complete_notification for OS-level alerts, and start_intensive_chat / ask_intensive_chat / stop_intensive_chat for persistent command-line sessions. This lets LLMs break out of purely text-based, asynchronous communication and engage users directly in their local environment, enabling more natural and controlled AI-assisted tasks.
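As an illustration, the tool-call payloads a client might send to these tools could look like the sketch below. The argument names (message, predefinedOptions, title) are assumptions for illustration, not confirmed from this project's tool schemas; consult the server's published schemas for the authoritative shapes.

```typescript
// Hypothetical MCP tool-call payloads for the tools described above.
// All argument names are illustrative assumptions, not the project's
// documented schema.

interface ToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

// Ask the user a question, optionally offering predefined choices.
const askUser: ToolCall = {
  name: "request_user_input",
  arguments: {
    message: "Which package manager should I configure?",
    predefinedOptions: ["npm", "pnpm", "yarn"],
  },
};

// Fire an OS-level alert when a long-running task completes.
const notify: ToolCall = {
  name: "message_complete_notification",
  arguments: { message: "Build finished successfully." },
};

// Lifecycle of a persistent command-line chat session:
// open it once, ask any number of questions, then close it.
const session: ToolCall[] = [
  { name: "start_intensive_chat", arguments: { title: "Pair programming" } },
  { name: "ask_intensive_chat", arguments: { message: "Rename this function?" } },
  { name: "stop_intensive_chat", arguments: {} },
];

console.log([askUser, notify, ...session].map((c) => c.name).join(","));
```

The session lifecycle is the key design point: instead of one round trip per question, the LLM holds a single interactive terminal session open across many questions.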
Quick Start & Requirements
Run npx -y interactive-mcp (add it to your MCP client configuration). Requires Node.js (version listed in package.json); pnpm is used for development. Register the server in claude_desktop_config.json (Claude Desktop), mcp.json (Cursor), or VS Code settings.json.
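For example, a typical entry in claude_desktop_config.json might look like the following sketch (the server key "interactive" is an arbitrary label of your choosing):

```json
{
  "mcpServers": {
    "interactive": {
      "command": "npx",
      "args": ["-y", "interactive-mcp"]
    }
  }
}
```

The equivalent entry under "mcpServers" in Cursor's mcp.json follows the same shape.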
Maintenance & Community
This project is in its early stages. Contribution guidelines are basic. No community links (Discord/Slack) or roadmap are provided.
Licensing & Compatibility
MIT License. Compatible with commercial and closed-source applications.
Limitations & Caveats
The project is early-stage. On macOS, specific Terminal.app profile settings are recommended for managing the spawned shell windows, which suggests some complexity in window management.
Last updated: 2 months ago. Status: inactive.