# ⚡ CodeRunner: Secure Code Execution Sandbox

CodeRunner provides a secure MCP (Model Context Protocol) server that runs inside a sandboxed environment, powered by Apple's native [container](https://github.com/apple/container) technology. It allows you to safely execute code generated by local or remote AI models such as Claude, OpenAI GPT, and Ollama.

This guide shows you how to use the pre-built CodeRunner sandbox.

## 🚀 Quick Start

### Prerequisites

- Mac with Apple Silicon (M1/M2/M3/M4 series)
- **[Apple `container` Tool](https://github.com/apple/container)** - [Download installer](https://github.com/apple/container/releases/download/0.1.0/container-0.1.0-installer-signed.pkg)
- **Python 3.10+**

### Step 1: Set Up Local Network

Run these commands once to configure the `.local` domain:

```bash
sudo container system dns create local
container system dns default set local
```

### Step 2: Start the Sandbox Container

This command downloads the CodeRunner sandbox image from Docker Hub (if it is not already present) and runs it:

```bash
# Run the container and make it available at http://coderunner.local:8222
container run \
  --name coderunner \
  --detach --rm \
  instavm/coderunner
```
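
Once the container is running, you can check that the sandbox is reachable before wiring up an AI client. Below is a minimal smoke test in Python; it assumes `requests` is installed and that the DNS setup from Step 1 lets your Mac resolve `coderunner.local`. It uses the same `/execute/` endpoint and payload shape as the curl example in Option 4 below.

```python
import requests

# Ask the sandbox to evaluate a trivial expression.
# A healthy sandbox replies with JSON along the lines of {"result": "4\n"}.
resp = requests.post(
    "http://coderunner.local:8222/execute/",
    json={"command": "print(2 + 2)"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```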

### Step 3: Clone the Repository

```bash
git clone https://github.com/BandarLabs/coderunner.git
cd coderunner
```

## 🔌 Integration Options

### Option 1: Claude Desktop Integration

Configure Claude Desktop to use CodeRunner as an MCP server:

1. **Copy the example configuration:**
   ```bash
   cp claude_mcp_proxy/claude_desktop_config.example.json claude_mcp_proxy/claude_desktop_config.json
   ```

2. **Edit the configuration file** and replace the placeholder paths:
   - Replace `/path/to/your/python` with your actual Python path (e.g., `/usr/bin/python3` or `/opt/homebrew/bin/python3`)
   - Replace `/path/to/coderunner` with the actual path to your cloned repository

   Example after editing:
   ```json
   {
     "mcpServers": {
       "coderunner": {
         "command": "/opt/homebrew/bin/python3",
         "args": ["/Users/yourname/coderunner/claude_mcp_proxy/mcp.py"]
       }
     }
   }
   ```

3. **Apply the configuration in Claude Desktop:**
   - Open Claude Desktop
   - Go to Settings → Developer
   - Add the MCP server configuration
   - Restart Claude Desktop

4. **Start using CodeRunner in Claude:**
   You can now ask Claude to execute code, and it will run safely in the sandbox!

### Option 2: Python OpenAI Agents

Use CodeRunner with OpenAI's Python agents library:

1. **Set your OpenAI API key:**
   ```bash
   export OPENAI_API_KEY="your-openai-api-key-here"
   ```

2. **Install the required dependencies:**
   ```bash
   pip install openai-agents
   ```

3. **Run the client:**
   ```bash
   python examples/openai_client.py
   ```

4. **Start coding:**
   Enter prompts like "write python code to generate 100 prime numbers" and watch the code execute safely in the sandbox!

### Option 3: Ollama via mcphost

Local models such as Llama 3.1 can use CodeRunner through [mcphost](https://github.com/mark3labs/mcphost). Download `mcphost` from the [releases page](https://github.com/mark3labs/mcphost/releases/tag/v0.14.0), then copy the bundled MCP configuration and start it against your Ollama model:

```bash
cp examples/.mcp.json ~/.mcp.json
~/Downloads/mcphost_Darwin_arm64/mcphost -m ollama:llama3.1:8b
```

### Option 4: Direct HTTP API

You can also send code to the sandbox's execution endpoint directly:

```bash
curl -X POST "http://coderunner.local:8222/execute/" -H "Content-Type: application/json" -d '{"command": "print(100**100)"}'
```

Response:

```json
{"result":"100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000\n"}
```
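
The same endpoint is easy to call from Python. Here is a minimal sketch using `requests`; the payload and response shape follow the curl example above, and the `run_in_sandbox` helper name is just for illustration.

```python
import requests

SANDBOX_URL = "http://coderunner.local:8222/execute/"

def run_in_sandbox(code: str) -> str:
    """Send Python code to the CodeRunner sandbox and return its captured output."""
    resp = requests.post(SANDBOX_URL, json={"command": code}, timeout=60)
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    # Same computation as the curl example, but from Python.
    print(run_in_sandbox("print(100**100)"))
```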

## 🛡️ Security Features

### Sandboxed Execution

All code runs in an isolated container environment, ensuring your host system remains secure.
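
To see the isolation for yourself, you can ask the sandbox to describe its own environment; the hostname and root directory listing it prints belong to the container, not to your Mac. A quick sketch, assuming the sandbox from Step 2 is running and `requests` is installed:

```python
import requests

# This snippet runs inside the sandbox, so it reports the container's view
# of the world rather than your host's.
probe = "import os, platform; print(platform.node()); print(sorted(os.listdir('/')))"
resp = requests.post(
    "http://coderunner.local:8222/execute/",
    json={"command": probe},
    timeout=60,
)
print(resp.json()["result"])
```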

## 🏗️ Architecture

CodeRunner consists of:

- **Sandbox Container:** Isolated execution environment with a Jupyter kernel
- **MCP Server:** Handles communication between AI models and the sandbox
- **Proxy Layer:** Manages connections and security
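
To make the proxy idea concrete, here is a rough sketch of how an MCP server can expose the sandbox's `/execute/` endpoint as a tool. This is an illustrative example built on the `mcp` Python SDK's `FastMCP` helper; it is not the repository's actual `claude_mcp_proxy/mcp.py`.

```python
import requests
from mcp.server.fastmcp import FastMCP  # pip install mcp requests

# Illustrative sketch only -- the real proxy lives in claude_mcp_proxy/mcp.py.
mcp = FastMCP("coderunner")

@mcp.tool()
def execute_code(command: str) -> str:
    """Run Python code inside the CodeRunner sandbox and return its output."""
    resp = requests.post(
        "http://coderunner.local:8222/execute/",
        json={"command": command},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["result"]

if __name__ == "__main__":
    mcp.run()  # serves MCP over stdio, which is how Claude Desktop connects
```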

## 📚 Examples

The `examples/` directory contains:

- `openai_client.py` - Example OpenAI agents integration
- `systemprompt.txt` - Sample system prompt for AI models

## 🤝 Contributing

We welcome contributions! Please see [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.