CodeRunner provides an MCP server running inside a secure sandbox, powered by Apple's native container technology, allowing you to safely execute code generated by local or remote AI models such as Ollama.
This guide is for using the pre-built CodeRunner sandbox.
- A Mac with Apple Silicon (M1/M2/M3/M4 series)
- Apple `container` tool (installed via download)
- Python 3.10+
Run these commands once to set up the `.local` top-level domain:

```bash
sudo container system dns create local
container system dns default set local
```

This single command will download the CodeRunner sandbox image from Docker Hub (if not already present) and run it:
```bash
# This will run the container named 'coderunner' and make it
# available at http://coderunner.local:8222
container run \
  --name coderunner \
  --detach --rm \
  instavm/coderunner
```

Finally, run the following from your terminal:
```bash
git clone https://github.com/BandarLabs/coderunner.git
cd coderunner

# Configure Claude Desktop MCP integration (optional)
cp claude_mcp_proxy/claude_desktop_config.example.json claude_mcp_proxy/claude_desktop_config.json
# Edit the config file with your local paths
```

Now you can give it prompts like "write python code to generate 100 primes" and watch it execute the code safely in the sandbox!
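After copying the example file, `claude_desktop_config.json` follows Claude Desktop's standard `mcpServers` layout. The sketch below shows that shape only; the command and path values are placeholders, so use the entries from `claude_desktop_config.example.json` in the repo as the source of truth:

```json
{
  "mcpServers": {
    "coderunner": {
      "command": "/path/to/python",
      "args": ["/path/to/coderunner/claude_mcp_proxy/<proxy script>"]
    }
  }
}
```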
Download mcphost from the releases page, copy the example MCP config, and point mcphost at an Ollama model:

```bash
cp examples/.mcp.json ~/.mcp.json
~/Downloads/mcphost_Darwin_arm64/mcphost -m ollama:llama3.1:8b
```

Alternatively, run the bundled OpenAI client example:

```bash
python examples/openai_client.py
```

You can also call the sandbox's HTTP API directly:

```bash
curl -X POST "http://coderunner.local:8222/execute/" \
  -H "Content-Type: application/json" \
  -d '{"command": "print(100**100)"}'
```
```json
{"result":"100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000\n"}
```
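The same endpoint can be called from Python. Below is a minimal sketch using only the standard library; the URL and the `{"command": ...}` / `{"result": ...}` shapes match the curl example above, while the helper names (`build_payload`, `execute`) are my own:

```python
import json
from urllib import request

# Same endpoint as the curl example above.
CODERUNNER_URL = "http://coderunner.local:8222/execute/"

def build_payload(command: str) -> bytes:
    """Encode a command as the JSON body the /execute/ endpoint expects."""
    return json.dumps({"command": command}).encode("utf-8")

def execute(command: str, url: str = CODERUNNER_URL) -> str:
    """POST a command to the sandbox and return the "result" field."""
    req = request.Request(
        url,
        data=build_payload(command),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]

# Usage (requires the coderunner container to be running):
#   execute("print(100**100)")
```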
