⚡ CodeRunner: a code-execution sandbox running on Apple containers

CodeRunner provides an MCP server running inside a secure sandbox, powered by Apple's native container technology, so you can safely execute code generated by local or remote AI models such as Ollama.

This guide is for using the pre-built CodeRunner sandbox.

🚀 Quick Start

Prerequisites

You need a Mac with Apple silicon running a recent macOS, with Apple's container CLI installed.

Step 1: Configure Local Network for Sandbox

Run these commands once to set up the .local top-level domain:

sudo container system dns create local
container system dns default set local

Step 2: Run the Pre-Built Sandbox Container

This single command downloads the CodeRunner sandbox image from Docker Hub (if it is not already present) and runs it:

# This will run the container named 'coderunner' and make it
# available at http://coderunner.local:8222
container run \
  --name coderunner \
  --detach --rm \
  instavm/coderunner

Step 3: Run an AI Task

Finally, clone the repository to get the example scripts:

git clone https://github.com/instavm/coderunner.git
cd coderunner

Now you can give it prompts like "write python code to generate 100 primes" and watch it execute the code safely in the sandbox!
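For that example prompt, the code the model generates and the sandbox executes might look something like this (purely illustrative, not output from the actual model):

```python
def first_primes(n):
    """Return the first n prime numbers using simple trial division."""
    primes = []
    candidate = 2
    while len(primes) < n:
        # candidate is prime if no smaller prime divides it
        if all(candidate % p for p in primes):
            primes.append(candidate)
        candidate += 1
    return primes

print(first_primes(100))
```

The point is that this code runs inside the container, not on your host machine.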

Use with Ollama (llama3.1) via mcphost

Download mcphost from its releases page, then copy the bundled MCP config into place and launch it with an Ollama model:

cp cookbooks/.mcp.json ~/.mcp.json
~/Downloads/mcphost_Darwin_arm64/mcphost  -m ollama:llama3.1:8b

You can also run it via the Python OpenAI Agents example:

python cookbooks/openai_testmcp.py

Use via curl

curl -X POST "http://coderunner.local:8222/execute/" \
  -H "Content-Type: application/json" \
  -d '{"command": "print(100**100)"}'

{"result":"100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000\n"}
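The same endpoint can be called from Python using only the standard library. This sketch mirrors the curl example above (the `/execute/` path and the `{"command": ...}` / `{"result": ...}` JSON shapes are taken from it); the sandbox must be running at `coderunner.local:8222` for a real request to succeed:

```python
import json
from urllib import request

def build_payload(command: str) -> bytes:
    """Serialize a command into the JSON body expected by /execute/."""
    return json.dumps({"command": command}).encode("utf-8")

def execute(command: str, base_url: str = "http://coderunner.local:8222") -> str:
    """POST a command to the sandbox and return the 'result' string."""
    req = request.Request(
        f"{base_url}/execute/",
        data=build_payload(command),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["result"]
```

Usage: `execute("print(100**100)")` should return the same long numeric string shown in the curl response above.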
