Orchestrate AI agents with Docker Compose
This repository provides a collection of Docker Compose configurations for building and running AI agents, simplifying the orchestration of LLMs, tools, and agent runtimes. It targets developers and researchers looking to experiment with and deploy multi-agent systems using readily available open-source components and Docker.
How It Works
The project leverages Docker Compose to define and manage multi-container applications for AI agents. Each demo is a self-contained project with a compose.yaml file, allowing users to easily spin up complex agent setups. It supports both local model execution via Docker Model Runner (DMR) and integration with cloud-based LLMs like OpenAI, offering flexibility in deployment and cost management.
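As a rough illustration only, a per-demo compose.yaml of this shape might use the Compose models element to attach a locally run model via Docker Model Runner; the agent service, build context, and ai/smollm2 model name below are hypothetical placeholders, not taken from any specific demo:

services:
  agent:
    build: .            # agent runtime built from the demo's own Dockerfile
    models:
      - llm             # Compose exposes the model's endpoint and name to the service as environment variables

models:
  llm:
    model: ai/smollm2   # any model pullable by Docker Model Runner

Cloud-backed demos would instead point the agent at a hosted LLM API and supply credentials, as described under Quick Start below.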
Quick Start & Requirements
Run docker compose up --build within a demo directory to build and start that example. Demos that use cloud-based LLMs expect an OpenAI API key supplied through a secret.openai-api-key
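file. One plausible way a demo might wire that key into a container is through Compose secrets; this is a sketch under that assumption, and the agent service name is hypothetical:

services:
  agent:
    build: .
    secrets:
      - openai-api-key          # mounted inside the container at /run/secrets/openai-api-key

secrets:
  openai-api-key:
    file: ./secret.openai-api-key   # create this file locally; do not commit it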
Highlighted Details
Maintenance & Community
No specific contributors, sponsorships, or community links (Discord/Slack) are mentioned in the README.
Licensing & Compatibility
Dual-licensed under Apache License 2.0 or MIT License. Individual examples may have their own licenses that must be respected. Compatible with commercial use under the terms of either license.
Limitations & Caveats
The README does not detail specific limitations or known issues. A GPU is strongly recommended for local model execution via Docker Model Runner, and users must ensure their Docker environment meets the stated requirements.