⚡ CodeRunner: Secure Code Execution Sandbox

CodeRunner provides a secure MCP (Model Context Protocol) code execution server that runs inside a sandboxed environment on your Mac, powered by Apple's native containers. It allows you to safely execute code generated by AI models like Claude, OpenAI GPT, or Ollama.

Leverage powerful remote LLMs (like ChatGPT or Claude Sonnet 4) to work with your local files, such as videos, securely on your Mac. The model generates code that runs in a local sandboxed environment, where it can install external libraries and execute without uploading your data to the cloud.

With CodeRunner, you can do things like the following, and more:

  • Trim and combine video segments with ffmpeg, without installing it manually.
  • Generate prime numbers from a simple prompt.
  • Visualize cryptocurrency trends, for example by charting the last four days of ETH prices with matplotlib, all within a secure, sandboxed environment.

This guide shows you how to use the pre-built CodeRunner sandbox.

🚀 Quick Start

Prerequisites

  • A Mac with Apple's container CLI installed (the container commands below depend on it).
  • Python 3 and pip, for running the example clients.
  • git, for cloning the repository.

Step 1: Set Up Local Network

Run these commands once to configure the .local domain:

sudo container system dns create local
container system dns default set local

Step 2: Start the Sandbox Container

This command downloads and runs the CodeRunner sandbox from Docker Hub:

# Run the container and make it available at http://coderunner.local:8222/sse
container run \
  --name coderunner \
  --detach --rm  --cpus 8 --memory 4g \
  instavm/coderunner

The MCP server will be available at:

http://coderunner.local:8222/sse
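
To sanity-check that the sandbox is up before wiring in a client, you can open a streaming request against the SSE endpoint from Python. This is a minimal sketch, assuming the requests package is installed (pip install requests):

# Quick reachability check for the sandbox's SSE endpoint.
import requests

# Open a streaming connection; a 200 status means the MCP server is listening.
resp = requests.get("http://coderunner.local:8222/sse", stream=True, timeout=10)
print("SSE endpoint status:", resp.status_code)
resp.close()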

Step 3: Clone the Repository

git clone https://github.com/BandarLabs/coderunner.git
cd coderunner

Step 4: Install Required Packages

pip install -r examples/requirements.txt

🔌 Integration Options

Option 1: Claude Desktop Integration

Configure Claude Desktop to use CodeRunner as an MCP server:

  1. Copy the example configuration:

    cd examples
    cp claude_desktop/claude_desktop_config.example.json claude_desktop/claude_desktop_config.json
  2. Edit the configuration file and replace the placeholder paths (a helper script for generating this file is sketched after this list):

    • Replace /path/to/your/python with your actual Python path (e.g., /usr/bin/python3 or /opt/homebrew/bin/python3)
    • Replace /path/to/coderunner with the actual path to your cloned repository

    Example after editing:

    {
      "mcpServers": {
        "coderunner": {
          "command": "/opt/homebrew/bin/python3",
          "args": ["/Users/yourname/coderunner/examples/claude_desktop/mcpproxy.py"]
        }
      }
    }
  3. Update Claude Desktop configuration:

    • Open Claude Desktop
    • Go to Settings → Developer
    • Add the MCP server configuration
    • Restart Claude Desktop
  4. Start using CodeRunner in Claude: You can now ask Claude to execute code, and it will run safely in the sandbox!
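
If you prefer not to edit the JSON by hand, the sketch below generates the same configuration with absolute paths filled in. The clone location and interpreter lookup are assumptions; adjust them for your machine:

# Generate the Claude Desktop MCP entry for CodeRunner with absolute paths.
import json
import shutil
from pathlib import Path

repo = Path.home() / "coderunner"      # assumed clone location; adjust as needed
python_bin = shutil.which("python3")   # your Python interpreter

config = {
    "mcpServers": {
        "coderunner": {
            "command": python_bin,
            "args": [str(repo / "examples" / "claude_desktop" / "mcpproxy.py")],
        }
    }
}

# Paste the output into claude_desktop/claude_desktop_config.json (step 1 above).
print(json.dumps(config, indent=2))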

Option 2: Python OpenAI Agents

Use CodeRunner with OpenAI's Python agents library:

  1. Set your OpenAI API key:

    export OPENAI_API_KEY="your-openai-api-key-here"
  2. Run the client:

    python examples/openai_agents/openai_client.py
  3. Start coding: Enter prompts like "write python code to generate 100 prime numbers" and watch it execute safely in the sandbox!
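
The bundled client handles the wiring for you, but if you want to build your own agent against the sandbox, the sketch below shows one way to do it with the openai-agents SDK's SSE helper. It is an illustrative outline, not a copy of the repo's openai_client.py, and assumes the agents package is installed:

# A minimal agent that executes code through the CodeRunner sandbox over MCP.
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerSse

async def main():
    # Connect to the sandbox's MCP endpoint over SSE.
    async with MCPServerSse(params={"url": "http://coderunner.local:8222/sse"}) as sandbox:
        agent = Agent(
            name="coderunner-agent",
            instructions="Write and execute code using the sandbox tools.",
            mcp_servers=[sandbox],
        )
        result = await Runner.run(agent, "write python code to generate 100 prime numbers")
        print(result.final_output)

asyncio.run(main())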

🛡️ Security Features

Sandboxed Execution

All code runs in an isolated container environment, ensuring your host system remains secure.

From the @apple/container documentation:

Security: Each container has the isolation properties of a full VM, using a minimal set of core utilities and dynamic libraries to reduce resource utilization and attack surface.

🏗️ Architecture

CodeRunner consists of:

  • Sandbox Container: Isolated execution environment with Jupyter kernel
  • MCP Server: Handles communication between AI models and the sandbox
  • Proxy Layer: Manages connections and security
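
To see what the MCP server exposes, you can connect to it directly and list its tools. The sketch below uses the official mcp Python SDK (pip install mcp) and discovers tool names at runtime rather than assuming them:

# List the tools exposed by the CodeRunner MCP server.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    async with sse_client("http://coderunner.local:8222/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())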

📚 Examples

The examples/ directory contains:

  • openai_agents - Example OpenAI agents integration
  • claude_desktop - Example Claude Desktop integration

🤝 Contributing

We welcome contributions! Please see CONTRIBUTING.md for guidelines.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.