A fast, security-focused, multi-language code execution engine built on a hybrid architecture.
It uses Bun + Hono for a low-latency API layer while isolating execution in dedicated worker pools for efficiency and control.
- Sub-millisecond execution – ~1ms warm execution via optimized worker pools
- Python daemon architecture – persistent Python workers for fast, repeated execution
- V8-Level Isolation – sandboxed JavaScript with zero access to Node.js internals
- Hybrid Engine – Bun for the API, Node.js for secure execution, Python daemons for speed
- Scalability – smart queuing + auto-scaling workers (2–8 workers)
- Enterprise-Ready – multi-layer sandboxing, validation, and fault tolerance
- 320+ req/sec – throughput measured under stress testing
| Language | Cold Start | Warm Execution | Throughput | Security |
|---|---|---|---|---|
| JavaScript | 24-35ms | 1ms | 1862 req/s | V8 Isolation |
| TypeScript | 45ms | 3-6ms | 213 req/s | V8 Isolation |
| Python | 12ms | 1ms | 320 req/s | Daemon Pool |
Benchmarks from stress testing with 20 concurrent requests
```
┌────────────────┐      ┌────────────────────┐      ┌───────────────────┐
│   Bun + Hono   │─────▶│    Process Pool    │─────▶│   isolated-vm     │
│  (API Layer)   │      │    Auto-Scaling    │      │   V8 Sandbox      │
└────────────────┘      └────────────────────┘      └───────────────────┘
    Ultra Fast             Smart Queuing              Perfect Isolation

┌────────────────┐      ┌────────────────────┐      ┌───────────────────┐
│   Universal    │─────▶│   Python Daemon    │─────▶│    Persistent     │
│     Runner     │      │        Pool        │      │     Workers       │
└────────────────┘      └────────────────────┘      └───────────────────┘
    Multi-lang             Lightning Fast             Sub-ms execution
```
- JavaScript Pool: 2–8 workers with dynamic scaling
- Python Daemon Pool: Persistent workers for zero cold starts
- Load-balanced request distribution with intelligent queuing
- Fault tolerance via auto-healing and graceful shutdowns
- Memory-efficient idle worker cleanup (see the sketch below)
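For orientation, here is a rough sketch of how the API layer hands work to an auto-scaling pool. The route path matches the API below, but the `ProcessPool` class, its method names, and the scaling logic are illustrative assumptions, not the actual `src/utils` implementation (language routing, idle scale-down, and error handling are omitted):

```typescript
// Illustrative only: a FIFO queue in front of an auto-scaling worker pool.
import { Hono } from "hono";

type Job = { code: string; resolve: (out: string) => void; reject: (e: Error) => void };

class ProcessPool {
  private queue: Job[] = [];
  private busy = 0;
  private workers: number;

  constructor(private minWorkers = 2, private maxWorkers = 8) {
    this.workers = minWorkers;
  }

  execute(code: string): Promise<string> {
    return new Promise((resolve, reject) => {
      this.queue.push({ code, resolve, reject });
      // Scale up when every worker is busy and work is still waiting.
      if (this.busy >= this.workers && this.workers < this.maxWorkers) this.workers++;
      this.drain();
    });
  }

  private drain() {
    while (this.queue.length > 0 && this.busy < this.workers) {
      const job = this.queue.shift()!;
      this.busy++;
      // In Valkode this round-trips to a child process running isolated-vm;
      // here a placeholder stands in for that worker IPC.
      this.runInWorker(job.code)
        .then(job.resolve, job.reject)
        .finally(() => { this.busy--; this.drain(); });
    }
  }

  private async runInWorker(code: string): Promise<string> {
    return `executed ${code.length} bytes`; // placeholder for the real worker call
  }
}

const pool = new ProcessPool();
const app = new Hono();

app.post("/api/v1/execute", async (c) => {
  // Language routing and validation omitted; response shape simplified.
  const { code } = await c.req.json<{ code: string }>();
  const output = await pool.execute(code);
  return c.json({ success: true, data: { output } });
});

export default app; // Bun serves the default export's fetch handler
```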
- V8-Level Sandboxing – total isolation from `require`, `process`, `global`, etc.
- Sub-millisecond Performance – warm starts of ~1ms for JS/Python
- Python Daemon Architecture – persistent workers eliminate cold starts
- Layered Security – V8 isolation, subprocess sandboxing, and input validation
- RESTful API – minimal, production-ready endpoints
- Smart Timeout Handling – queue-aware and configurable
- Real-time Metrics – monitor worker states and performance
- Comprehensive Test Suite – 30 tests, 178 assertions, 100% pass rate
- Zero Downtime – graceful shutdowns and resilient queues
- Stress Tested – 320+ req/sec with a 100% success rate
| Language | Aliases | Security | Warm Execution | Throughput | Status |
|---|---|---|---|---|---|
| JavaScript | `javascript`, `js` | V8 Isolation | 1ms | 1862 req/s | Stable |
| TypeScript | `typescript`, `ts` | V8 Isolation | 3-6ms | 213 req/s | Stable |
| Python | `python`, `py` | Daemon Pool | 1ms | 320 req/s | Stable |
```bash
git clone https://github.com/itisrohit/valkode.git
cd valkode
bun install

# Install worker dependencies
cd scripts && npm install && cd ..
```

Start the server:

```bash
bun run src/server.ts
```

You'll see:
```
Initializing JavaScript process pool with 2 workers...
Initializing python daemon pool with 2 workers...
Process pool initialized with 2 workers
python pool warmed up
Python runner ready
Valkode server listening on http://localhost:3000
```
`GET /api/v1/health`

Response:

```json
{
"success": true,
"data": {
"status": "ok",
"processPool": {
"totalWorkers": 2,
"busyWorkers": 0,
"idleWorkers": 2,
"pendingRequests": 0
},
"runners": {
"python": {
"available": true,
"totalWorkers": 4,
"successRate": 100
}
}
}
}
```

`GET /api/v1/languages`

Response:

```json
{
"success": true,
"data": {
"languages": ["javascript", "js", "typescript", "ts", "python", "py"]
}
}
```

`POST /api/v1/execute`

Request:

```json
{
"code": "console.log('Hello World!');",
"language": "javascript",
"timeout": 5000
}
```

Response:

```json
{
"success": true,
"data": {
"success": true,
"output": "Hello World!",
"executionTime": 1
}
}
```

```bash
# JavaScript
curl -X POST http://localhost:3000/api/v1/execute \
  -H "Content-Type: application/json" \
  -d '{"code": "console.log(2 + 2);", "language": "javascript"}'

# TypeScript
curl -X POST http://localhost:3000/api/v1/execute \
  -H "Content-Type: application/json" \
  -d '{"code": "const msg: string = \"Hello\"; console.log(msg);", "language": "typescript"}'

# Python
curl -X POST http://localhost:3000/api/v1/execute \
  -H "Content-Type: application/json" \
  -d '{"code": "print(\"Hello from Python daemon!\")", "language": "python"}'
```
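The same endpoint is easy to call from code. A minimal TypeScript client sketch based on the request/response shapes shown above (error handling kept deliberately simple):

```typescript
// Minimal client sketch for POST /api/v1/execute, using the shapes shown above.
interface ExecuteResult {
  success: boolean;
  data: { success: boolean; output: string; executionTime: number };
}

async function runCode(code: string, language: string, timeout = 5000): Promise<string> {
  const res = await fetch("http://localhost:3000/api/v1/execute", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ code, language, timeout }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const body = (await res.json()) as ExecuteResult;
  return body.data.output;
}

console.log(await runCode("print('hi from python')", "python"));
```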
- Complete V8 isolation using `isolated-vm`
- `require`, `process`, `global`, `eval` – all blocked
- No file system or network access
- Memory-limited execution contexts
- Persistent daemon workers with controlled environments
- Input validation and sanitization
- Isolated execution contexts per request
- Resource limiting and timeout controls
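To make the JavaScript sandboxing concrete, here is a condensed sketch of the isolated-vm pattern, assuming isolated-vm v4's `Isolate`/`Context`/`Callback` API; the real worker script in `scripts/` is more involved:

```typescript
// Condensed sketch of the V8 sandboxing pattern; not the actual worker in scripts/.
import ivm from "isolated-vm";

export async function runSandboxed(code: string, timeoutMs = 5000): Promise<string> {
  // One isolate per execution, with a hard 128 MB memory cap.
  const isolate = new ivm.Isolate({ memoryLimit: 128 });
  const context = await isolate.createContext();

  // Bridge console.log output back to the host through a safe callback;
  // nothing from Node (require, process, fs, net) exists inside the isolate.
  const lines: string[] = [];
  await context.global.set("__log", new ivm.Callback((msg: string) => lines.push(msg)));
  await context.eval(`globalThis.console = { log: (...a) => __log(a.join(" ")) };`);

  try {
    await context.eval(code, { timeout: timeoutMs });
  } finally {
    isolate.dispose();
  }
  return lines.join("\n");
}
```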
```typescript
// JavaScript Process Pool
{
  minWorkers: 2,
  maxWorkers: 8,
  workerIdleTimeout: 30000,
  maxQueueSize: 100
}

// Python Daemon Pool
{
  poolSize: 2,
  warmupEnabled: true,
  persistentWorkers: true
}
```

- Default timeout: 5000ms
- Memory limit: 128MB per worker
- Default port: 3000
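For reference, the same settings expressed as typed TypeScript constants; this layout is purely illustrative and does not mirror the actual source files:

```typescript
// Illustrative typed view of the configuration above; not the actual config module.
interface ProcessPoolConfig {
  minWorkers: number;
  maxWorkers: number;
  workerIdleTimeout: number; // ms before an idle worker is cleaned up
  maxQueueSize: number;      // upper bound on queued requests
}

interface PythonPoolConfig {
  poolSize: number;
  warmupEnabled: boolean;
  persistentWorkers: boolean;
}

export const JS_POOL: ProcessPoolConfig = {
  minWorkers: 2, maxWorkers: 8, workerIdleTimeout: 30_000, maxQueueSize: 100,
};
export const PYTHON_POOL: PythonPoolConfig = {
  poolSize: 2, warmupEnabled: true, persistentWorkers: true,
};

export const DEFAULT_TIMEOUT_MS = 5_000;
export const WORKER_MEMORY_LIMIT_MB = 128;
export const DEFAULT_PORT = 3_000;
```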
```
src/
├── api/        – REST endpoints (health, execute, languages)
├── engine/     – Core execution logic & sandbox
├── runners/    – Language-specific execution handlers
├── types/      – TypeScript type definitions
└── utils/      – Process pools & worker management
scripts/        – Isolated VM worker scripts
tests/          – Comprehensive test suite (30 tests)
├── sandbox.test.ts     – Core functionality tests
├── api.test.ts         – API performance tests
└── benchmarks.test.ts  – Performance benchmarks
```
```bash
# Run all tests
bun test

# Run CI simulation (full pipeline)
./scripts/test-ci.sh
```

Test Coverage:
- 30 tests passing
- 178 assertions
- Core functionality, API performance, and benchmarks
- Stress testing (320+ req/sec throughput)
- Concurrent execution (20 simultaneous requests)
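A representative end-to-end test in the style of `tests/api.test.ts` might look like this (a sketch using `bun:test`; it assumes a server reachable on localhost:3000, and the real assertions may differ):

```typescript
// Sketch of an end-to-end test with bun:test; assumes the server is running on :3000.
import { describe, expect, it } from "bun:test";

describe("POST /api/v1/execute", () => {
  it("runs JavaScript in the sandbox", async () => {
    const res = await fetch("http://localhost:3000/api/v1/execute", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ code: "console.log(2 + 2);", language: "javascript" }),
    });
    expect(res.status).toBe(200);
    const body = await res.json();
    expect(body.success).toBe(true);
    expect(body.data.output).toContain("4");
  });
});
```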
```bash
# Monitor process pool status
watch -n 1 'curl -s http://localhost:3000/api/v1/health | jq .data.processPool'

# Monitor Python daemon status
watch -n 1 'curl -s http://localhost:3000/api/v1/health | jq .data.runners.python'
```

```bash
# High-throughput testing
ab -n 1000 -c 20 -p post_data.json -T application/json \
  http://localhost:3000/api/v1/execute
```

- ✅ Process pool with auto-scaling (2-8 workers)
- ✅ V8 isolation + intelligent queuing
- ✅ Python daemon architecture for sub-ms execution
- ✅ Multi-language support (JS, TS, Python)
- ✅ Comprehensive testing suite (30 tests)
- ✅ Production-ready performance (320+ req/sec)
- Go language support
- WebAssembly (WASM) integration
- Enhanced monitoring & metrics
- Rust execution via WASM
- C/C++ support via WASM
- Advanced resource limiting
- WebSocket support for real-time execution
- Multi-file & package execution
- Redis-powered distributed worker pools
- Horizontal scaling capabilities
- Complete WASM ecosystem (Ruby, Java via GraalVM)
- Plugin architecture for custom languages
- Advanced security sandboxing
- Enterprise deployment tools
```bash
git checkout -b feature/your-feature
bun test              # Ensure all tests pass
./scripts/test-ci.sh  # Run full CI simulation
git commit -m "feat: add awesome feature"
git push origin feature/your-feature
```

Contribution Guidelines:
- Maintain test coverage (currently 30 tests, 178 assertions)
- Security-first development
- Smart worker pool behavior
- Performance regression testing
- Update documentation and benchmarks
- Bun – Ultra-fast JavaScript runtime
- isolated-vm – V8 isolation
- Hono – Fast web framework
- Open Source Security Community
Built with ❤️ by Rohit Kumar

⭐ Star this repo if you liked it!