LLM code execution sandbox using Docker containers
This library provides a lightweight, portable sandbox runtime for securely executing LLM-generated code inside Docker containers. It is aimed at developers and researchers who need to run code snippets from LLMs in isolated environments, and it supports multiple programming languages while offering flexibility through custom Dockerfiles and integrations with AI frameworks such as LangChain and LlamaIndex.
How It Works
The core of the library is the SandboxSession class, which manages the lifecycle of Docker containers for code execution. It leverages Docker to create isolated environments, so code in a range of languages (Python, Java, JavaScript, C++, Go, Ruby) runs within a controlled setup. The approach emphasizes ease of use, portability via predefined or custom Docker images, and scalability through support for Kubernetes and remote Docker hosts.
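The snippet below is a minimal sketch of the typical context-manager workflow, based on the project's documented examples; the constructor options shown here (lang, keep_template) and the shape of the returned result may vary between versions.

from llm_sandbox import SandboxSession

# Start a sandboxed Python container, run a snippet, and tear it down on exit.
# keep_template=True keeps the built image around for faster subsequent runs.
with SandboxSession(lang="python", keep_template=True) as session:
    result = session.run("print('Hello from the sandbox')")
    print(result)  # stdout captured from the container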
Quick Start & Requirements
Install with pip install llm-sandbox, or pick a backend via extras: pip install llm-sandbox[kubernetes], pip install llm-sandbox[podman], or pip install llm-sandbox[docker]. The kubernetes extra requires the kubernetes Python package and a configured Kubernetes cluster.
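Below is a hypothetical sketch of running code against a Kubernetes cluster instead of a local Docker daemon. It assumes the kubernetes extra is installed and a kubeconfig is available; passing the client explicitly and the exact backend-selection parameter are assumptions that may differ across llm-sandbox releases.

from kubernetes import client, config
from llm_sandbox import SandboxSession

# Assumes a working kubeconfig (e.g. ~/.kube/config) and the kubernetes extra.
config.load_kube_config()
k8s_client = client.CoreV1Api()

# Handing the session an explicit Kubernetes client is an assumption about the
# constructor; consult the project's docs for the exact backend parameter.
with SandboxSession(client=k8s_client, lang="python") as session:
    result = session.run("print('Hello from a pod')")
    print(result)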
Highlighted Details
Maintenance & Community
Release history is tracked in CHANGELOG.md.
Licensing & Compatibility
Limitations & Caveats
The project is actively seeking contributions for enhancements such as improved resource monitoring accuracy and container pooling for performance. Specific security scanning patterns and distributed execution capabilities are listed as areas for future development.