docker-mcp by QuantGeekDev

MCP server for Docker operations

created 8 months ago
354 stars

Top 79.9% on sourcepulse

View on GitHub
Project Summary

This project provides a Model Context Protocol (MCP) server for Docker, enabling users to manage Docker containers and Compose stacks via Claude AI. It targets developers and researchers who need to interact with Docker environments programmatically, offering a convenient way to orchestrate containerized applications using natural language prompts.

How It Works

The server exposes a set of MCP tools that interface with the Docker Engine API: the AI client sends tool calls over MCP, the server executes the corresponding Docker operations, and the results are returned to the model. Key capabilities include container creation, Docker Compose stack deployment, log retrieval, and container status monitoring.
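As a concrete sketch (not the project's actual source), a server of this kind can be built with the MCP Python SDK's FastMCP helper and the docker SDK, wrapping each Docker operation in a tool the model can call. Whether this project uses those libraries, and the tool names and parameters below, are illustrative assumptions.

import docker
from mcp.server.fastmcp import FastMCP

# Hypothetical sketch: each Docker operation is exposed as an MCP tool.
mcp = FastMCP("docker-mcp-sketch")
client = docker.from_env()  # connect to the local Docker Engine

@mcp.tool()
def create_container(image: str, name: str, ports: dict | None = None) -> str:
    """Create and start a detached container from the given image."""
    container = client.containers.run(image, name=name, ports=ports, detach=True)
    return f"Started container {container.short_id} ({name})"

@mcp.tool()
def get_logs(name: str, tail: int = 100) -> str:
    """Return the most recent log lines from a container."""
    return client.containers.get(name).logs(tail=tail).decode()

@mcp.tool()
def list_containers() -> str:
    """List all containers with their current status."""
    return "\n".join(f"{c.name}: {c.status}" for c in client.containers.list(all=True))

if __name__ == "__main__":
    mcp.run()  # serve over stdio for an MCP client such as Claude Desktop

A Compose-stack deployment tool would follow the same pattern, with the tool wrapping the relevant Compose invocation instead of a single container call.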

Quick Start & Requirements

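A Python-based MCP server such as this one is typically registered with Claude Desktop through its claude_desktop_config.json file. The entry below is a sketch that assumes the package is published on PyPI as docker-mcp and can be launched with uvx; a local Docker Engine (or Docker Desktop) must be running, and the repository README should be consulted for the exact command and prerequisites.

{
  "mcpServers": {
    "docker-mcp": {
      "command": "uvx",
      "args": ["docker-mcp"]
    }
  }
}

After editing the config, restart Claude Desktop so the new server and its Docker tools are picked up.
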
Highlighted Details

  • AI-powered Docker container and Compose stack management.
  • Supports container creation, deployment, log retrieval, and status monitoring.
  • Integrates with Claude AI via the Model Context Protocol.
  • Offers tools for debugging with the MCP Inspector (see the invocation sketch after this list).
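
For the MCP Inspector item, a typical invocation wraps the server's launch command (here assuming the uvx launch shown under Quick Start):

npx @modelcontextprotocol/inspector uvx docker-mcp

This opens a local web UI where each exposed tool can be called manually and its raw responses inspected.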

Maintenance & Community

  • Core contributors: Alex Andru (@QuantGeekDev), Ali Sadykov (@md-archive).
  • Open for contributions via pull requests.

Licensing & Compatibility

  • MIT License. Permissive for commercial use and closed-source linking.

Limitations & Caveats

Current limitations include the absence of built-in support for environment variables, volume management, network management, container health checks, and resource limits.

Health Check

  • Last commit: 7 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0

Star History

100 stars in the last 90 days

Explore Similar Projects

Starred by Elie Bursztein (Cybersecurity Lead at Google DeepMind), Tim J. Baek (Founder of Open WebUI), and 1 more.

harbor by av

CLI tool for local LLM stack orchestration

created 1 year ago, updated 3 weeks ago
2k stars

Top 0.3% on sourcepulse