
Surrogate OS


The AI Identity Engine

Synthesize complete AI employees from a role description.
Deploy across chat, voice, AR, and humanoid interfaces.


Docs · Quick Start · Architecture · API Reference · Roadmap


What is Surrogate OS?

Surrogate OS is a platform that creates professional AI surrogates — digital employees with domain expertise, regulatory compliance, and institutional memory. Define a role (Senior ER Nurse, M&A Legal Advisor), and the system generates decision-making SOPs, enforces compliance across six frameworks (HIPAA, GDPR, CQC, FCA, SOX, and the EU AI Act), and learns from every interaction. Think of it as an operating system for AI workers that don't just assist an employee, but become one.


Key Features

Identity & Expertise

  • Surrogate Studio — Create AI professional identities from a role description
  • Persona Templates — Versioned persona library with rollback, import/export
  • SOP Generation — LLM-powered decision graphs (9 node types) with certification
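The decision graphs behind SOP Generation can be pictured as a typed node union. Here is a minimal TypeScript sketch with illustrative node names — the platform's actual nine node types may differ:

```typescript
// Hypothetical sketch of SOP decision-graph nodes as a discriminated union.
// These nine node-type names are illustrative, not the platform's real schema.
type SopNode =
  | { kind: "start"; next: string }
  | { kind: "decision"; question: string; branches: Record<string, string> }
  | { kind: "action"; description: string; next: string }
  | { kind: "escalation"; toHuman: true; next?: string }
  | { kind: "checkpoint"; approver: string; next: string }
  | { kind: "llm_call"; prompt: string; next: string }
  | { kind: "memory_read"; key: string; next: string }
  | { kind: "memory_write"; key: string; next: string }
  | { kind: "end"; outcome: string };

// Walk a graph of named nodes, always taking the first branch at decisions.
function firstPath(nodes: Record<string, SopNode>, startId: string): string[] {
  const visited: string[] = [];
  let id: string | undefined = startId;
  while (id && !visited.includes(id)) {
    visited.push(id);
    const node = nodes[id];
    if (!node || node.kind === "end") break;
    id = node.kind === "decision" ? Object.values(node.branches)[0] : node.next;
  }
  return visited;
}

const demo: Record<string, SopNode> = {
  start: { kind: "start", next: "triage" },
  triage: { kind: "decision", question: "Urgent?", branches: { yes: "escalate", no: "done" } },
  escalate: { kind: "escalation", toHuman: true, next: "done" },
  done: { kind: "end", outcome: "resolved" },
};
console.log(firstPath(demo, "start")); // ["start", "triage", "escalate", "done"]
```

A discriminated union like this is what makes certification tractable: every node kind can be exhaustively checked before a graph is signed.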

Intelligence

  • Institutional Memory — Short-term and long-term memory with pattern promotion
  • Shift Debriefs — Automated session analysis with improvement recommendations
  • SOP Self-Update — Automatic improvement proposals from debrief insights
  • Org DNA — Embed organizational knowledge via pgvector for RAG retrieval

Compliance & Safety

  • 6 Regulatory Frameworks — HIPAA, GDPR, CQC, FCA, SOX, EU AI Act
  • Ed25519 SOP Signing — Cryptographic certification of decision procedures
  • Bias Auditing — LLM-powered anomaly detection and fairness analysis
  • Human-in-the-Loop — Kill switches, escalation nodes, checkpoint approvals
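The Ed25519 certification flow can be reproduced end-to-end with Node's built-in crypto module. This is a sketch only — it assumes a JSON-canonicalized SOP payload and omits the platform's real key management:

```typescript
// Sketch: certifying an SOP document with Ed25519 via node:crypto.
// Key handling and payload shape are assumptions, not the platform's pipeline.
import { generateKeyPairSync, sign, verify } from "node:crypto";

const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// Canonicalize the SOP before signing so byte-identical re-serialization verifies.
const sop = JSON.stringify({ id: "sop-001", version: 3, nodes: ["start", "end"] });
const payload = Buffer.from(sop);

// Ed25519 hashes internally, so the algorithm argument is null.
const signature = sign(null, payload, privateKey);
console.log(verify(null, payload, publicKey, signature)); // true

// Any tampering with the certified SOP invalidates the signature.
const tampered = Buffer.from(sop.replace('"version":3', '"version":4'));
console.log(verify(null, tampered, publicKey, signature)); // false
```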

Operations

  • Fleet Management — Monitor all surrogates in real-time with health metrics
  • Handoff Protocol — Seamless surrogate-to-surrogate and surrogate-to-human transfers
  • Execution Engine — Live SOP traversal with decision recording

Enterprise

  • Multi-Tenant — Schema-per-tenant isolation with 25+ tables per org
  • API Keys & Webhooks — Programmatic access and event-driven integrations
  • Federated Learning — Cross-org insights with differential privacy
  • Data Portability — Full org export/import, zero lock-in
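Webhook consumers should verify deliveries before trusting them. A receiver-side sketch using HMAC-SHA256 — the signing scheme and secret format here are assumptions, not Surrogate OS's documented webhook contract:

```typescript
// Hypothetical receiver-side check for signed webhook deliveries.
// The HMAC-SHA256 scheme and secret prefix are assumptions for illustration.
import { createHmac, timingSafeEqual } from "node:crypto";

function signPayload(secret: string, body: string): string {
  return createHmac("sha256", secret).update(body).digest("hex");
}

function verifyWebhook(secret: string, body: string, signatureHeader: string): boolean {
  const expected = Buffer.from(signPayload(secret, body));
  const received = Buffer.from(signatureHeader);
  // Constant-time comparison avoids leaking signature bytes via timing.
  return expected.length === received.length && timingSafeEqual(expected, received);
}

const secret = "whsec_demo";
const body = JSON.stringify({ event: "surrogate.handoff", surrogateId: "sg-42" });
const header = signPayload(secret, body);
console.log(verifyWebhook(secret, body, header)); // true
console.log(verifyWebhook(secret, body + " ", header)); // false
```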

Interface

  • Surrogate Chat — Real-time conversations with your AI employees
  • Humanoid SDK — Interface abstraction scaling from chat to fully autonomous humanoids
  • SOP Marketplace — Publish, discover, and install SOPs across organizations

Architecture

┌─────────────────────────────────────────┐
│           Interface Layer               │
│    Chat · Voice · AR · Humanoid         │
├─────────────────────────────────────────┤
│          Execution Layer                │
│   SOP Engine · Session · Decisions      │
├─────────────────────────────────────────┤
│          Intelligence Layer             │
│  Memory · Debriefs · Proposals · DNA    │
├─────────────────────────────────────────┤
│           Identity Layer                │
│  Surrogates · Personas · SOPs · Certs   │
├─────────────────────────────────────────┤
│         Infrastructure Layer            │
│ Multi-Tenant · OTEL · Prisma · pgvector │
└─────────────────────────────────────────┘

Loosely coupled — an event bus (15 event types), a service registry for dependency injection, and a unified LLM layer spanning 4 providers.
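The event-bus decoupling between layers can be sketched as a typed in-process bus. The event names and payloads below are illustrative, not the platform's 15 real event types:

```typescript
// Minimal sketch of a typed in-process event bus.
// Event names and payload shapes are illustrative examples only.
type Events = {
  "session.started": { sessionId: string };
  "sop.certified": { sopId: string; version: number };
};

class EventBus {
  private handlers: { [K in keyof Events]?: Array<(p: Events[K]) => void> } = {};

  on<K extends keyof Events>(event: K, handler: (payload: Events[K]) => void): void {
    (this.handlers[event] ??= []).push(handler);
  }

  emit<K extends keyof Events>(event: K, payload: Events[K]): void {
    for (const handler of this.handlers[event] ?? []) handler(payload);
  }
}

const bus = new EventBus();
const seen: string[] = [];
bus.on("sop.certified", ({ sopId, version }) => seen.push(`${sopId}@v${version}`));
bus.emit("sop.certified", { sopId: "sop-001", version: 3 });
console.log(seen); // ["sop-001@v3"]
```

Because publishers and subscribers only share the event map, any layer can react to `sop.certified` without importing the identity layer that emitted it.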


Quick Start

Get running in under 2 minutes:

```bash
git clone https://github.com/vishalm/surrogate-os.git
cd surrogate-os/infra
docker compose up -d
```
| Service | URL |
| --- | --- |
| Web Dashboard | localhost:3000 |
| API Server | localhost:3001 |
| Swagger Docs | localhost:3001/docs |
| Grafana | localhost:4000 |

Login: admin@acme.com / Password123!

Set your ANTHROPIC_API_KEY in infra/.env to enable LLM features. See the Quick Start Guide for details.


Tech Stack

| Layer | Technology |
| --- | --- |
| Frontend | Next.js 15, React, Tailwind CSS |
| Backend | Fastify, Node.js 22 |
| Database | PostgreSQL 16, Prisma ORM, pgvector |
| Observability | OpenTelemetry, Grafana, Prometheus, Loki, Tempo |
| Infrastructure | Docker Compose (8 containers) |
| Language | TypeScript (strict mode) |

Platform Stats

| Metric | Count |
| --- | --- |
| API Modules | 25 |
| API Endpoints | ~130 |
| Dashboard Pages | 47 |
| Tests Passing | 571 |
| Docker Services | 8 |
| Compliance Frameworks | 6 |
| LLM Providers | 4 |

LLM Providers

Surrogate OS supports multiple LLM backends, configurable per-organization:

| Provider | Type |
| --- | --- |
| Anthropic Claude | Cloud |
| OpenAI | Cloud |
| Azure OpenAI | Cloud |
| Ollama | Local / Self-hosted |

Switch providers in Settings > LLM Configuration without changing any surrogate definitions.
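Provider switching works because surrogates code against a provider-neutral interface. A sketch with a hypothetical interface and a local stub — the actual abstraction inside Surrogate OS may differ:

```typescript
// Sketch of a provider-neutral LLM interface; names are illustrative.
interface LlmProvider {
  name: string;
  complete(prompt: string): Promise<string>;
}

// A local stub standing in for a real backend (Anthropic, OpenAI, Azure, Ollama).
class EchoProvider implements LlmProvider {
  name = "echo";
  async complete(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

// Surrogate logic depends only on the interface, so swapping the configured
// provider never touches surrogate definitions.
async function runSurrogateTurn(llm: LlmProvider, userMessage: string): Promise<string> {
  return llm.complete(`You are a helpful professional surrogate.\nUser: ${userMessage}`);
}

const reply = await runSurrogateTurn(new EchoProvider(), "Summarize the shift.");
console.log(reply.startsWith("echo:")); // true
```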


Documentation

Full documentation is available at vishalm.github.io/surrogate-os


Contributing

We welcome contributions! See CONTRIBUTING.md for development setup, coding conventions, and PR guidelines.

Looking for a place to start? Check out issues labeled good first issue.


Roadmap

See ROADMAP.md for planned features and the development timeline.


License

Released under the MIT License.



If you find Surrogate OS useful, consider giving it a star.
