Secure local sandbox for LLM-generated code execution
CodeRunner provides a secure, sandboxed environment for executing AI-generated code locally on macOS, specifically designed for Apple Silicon Macs. It enables users to process local files with remote LLMs like Claude or ChatGPT without uploading sensitive data, by running the LLM-generated code within isolated Apple containers.
How It Works
CodeRunner leverages Apple's native container technology, offering VM-level isolation for code execution. This approach ensures that code runs in a secure, minimal environment with reduced resource utilization and attack surface. An MCP (Model Context Protocol) server facilitates communication between AI models and the sandboxed execution environment, allowing seamless integration with tools like Claude Desktop and OpenAI agents.
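The execute-and-capture pattern an MCP tool wraps can be sketched in a few lines. Note this is an illustrative stand-in only: CodeRunner isolates execution inside Apple containers with VM-level isolation, whereas the `run_untrusted` helper below (a hypothetical name, not part of CodeRunner's API) simply runs code in a separate local process with a timeout to show the shape of the flow.

```python
import subprocess
import sys

def run_untrusted(code: str, timeout: int = 5) -> str:
    """Illustrative stand-in: execute LLM-generated code in a child
    process and capture its stdout. CodeRunner's actual sandbox uses
    Apple's container technology rather than a bare subprocess."""
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.stdout

print(run_untrusted("print(2 + 2)"))  # prints "4"
```

In the real system, the MCP server receives the code from the AI model, forwards it into the isolated container, and returns the captured output to the model, so sensitive local files never leave the machine.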
Quick Start & Requirements
Make `install.sh` executable and run `sudo ./install.sh`. Then install the example dependencies with `pip install -r examples/requirements.txt`.
Maintenance & Community
The project is actively maintained by instavm. Contributions are welcome.
Licensing & Compatibility
Licensed under the Apache 2.0 License. This license is permissive and generally compatible with commercial and closed-source applications.
Limitations & Caveats
Currently limited to macOS with Apple Silicon hardware. The integration examples require specific configuration steps for each supported AI tool.
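For Claude Desktop, for example, MCP servers are registered in its `claude_desktop_config.json` file. A hypothetical entry might look like the fragment below; the server name, command, and path are placeholders for illustration, not CodeRunner's actual values, so consult the project's integration examples for the real configuration.

```json
{
  "mcpServers": {
    "coderunner": {
      "command": "/path/to/coderunner-mcp-server",
      "args": []
    }
  }
}
```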