The secure, lightweight open-source AI agent framework.
Quick Start · Why OpenCrust? · Features · Security · Architecture · Migrate from OpenClaw · Contributing
A single 16 MB binary that runs your AI agents across Telegram, Discord, Slack, WhatsApp, and iMessage - with encrypted credential storage, config hot-reload, and 13 MB of RAM at idle. Built in Rust for the security and reliability that AI agents demand.
# Install (Linux, macOS)
curl -fsSL https://raw.githubusercontent.com/opencrust-org/opencrust/main/install.sh | sh
# Interactive setup - pick your LLM provider and channels
opencrust init
# Start - on first message, the agent will introduce itself and learn your preferences
opencrust startBuild from source
# Requires Rust 1.85+
cargo build --release
./target/release/opencrust init
./target/release/opencrust start
# Optional: include WASM plugin support
cargo build --release --features pluginsPre-compiled binaries for Linux (x86_64, aarch64), macOS (Intel, Apple Silicon), and Windows (x86_64) are available on GitHub Releases.
| OpenCrust | OpenClaw (Node.js) | ZeroClaw (Rust) | |
|---|---|---|---|
| Binary size | 16 MB | ~1.2 GB (with node_modules) | ~25 MB |
| Memory at idle | 13 MB | ~388 MB | ~20 MB |
| Cold start | 3 ms | 13.9 s | ~50 ms |
| Credential storage | AES-256-GCM encrypted vault | Plaintext config file | Plaintext config file |
| Auth default | Enabled (WebSocket pairing) | Disabled by default | Disabled by default |
| Scheduling | Cron, interval, one-shot | Yes | No |
| Multi-agent routing | Planned (#108) | Yes (agentId) | No |
| Session orchestration | Planned (#108) | Yes | No |
| MCP support | Stdio | Stdio + HTTP | Stdio |
| Channels | 5 | 6+ | 4 |
| LLM providers | 14 | 10+ | 22+ |
| Pre-compiled binaries | Yes | N/A (Node.js) | Build from source |
| Config hot-reload | Yes | No | No |
| WASM plugin system | Optional (sandboxed) | No | No |
| Self-update | Yes (opencrust update) |
npm | Build from source |
Benchmarks measured on a 1 vCPU, 1 GB RAM DigitalOcean droplet. Reproduce them yourself.
OpenCrust is built for the security requirements of always-on AI agents that access private data and communicate externally.
- Encrypted credential vault - API keys and tokens stored with AES-256-GCM encryption at
~/.opencrust/credentials/vault.json. Never plaintext on disk. - Authentication by default - WebSocket gateway requires pairing codes. No unauthenticated access out of the box.
- User allowlists - per-channel allowlists control who can interact with the agent. Unauthorized messages are silently dropped.
- Prompt injection detection - input validation and sanitization before content reaches the LLM.
- WASM sandboxing - optional plugin sandbox via WebAssembly runtime with controlled host access (compile with
--features plugins). - Localhost-only binding - gateway binds to
127.0.0.1by default, not0.0.0.0.
Native providers:
- Anthropic Claude - streaming (SSE), tool use
- OpenAI - GPT-4o, Azure, any OpenAI-compatible endpoint via
base_url - Ollama - local models with streaming
OpenAI-compatible providers:
- Sansa - regional LLM via sansaml.com
- DeepSeek - DeepSeek Chat
- Mistral - Mistral Large
- Gemini - Google Gemini via OpenAI-compatible API
- Falcon - TII Falcon 180B (AI71)
- Jais - Core42 Jais 70B
- Qwen - Alibaba Qwen Plus
- Yi - 01.AI Yi Large
- Cohere - Command R Plus
- MiniMax - MiniMax Text 01
- Moonshot - Kimi K2
- Telegram - streaming responses, MarkdownV2, bot commands, typing indicators, user allowlist with pairing codes, photo/vision support, voice messages (Whisper STT), document/file handling
- Discord - slash commands, event-driven message handling, session management
- Slack - Socket Mode, streaming responses, allowlist/pairing
- WhatsApp - Meta Cloud API webhooks, allowlist/pairing
- iMessage - macOS native via chat.db polling, group chats, AppleScript sending (setup guide)
- Connect any MCP-compatible server (filesystem, GitHub, databases, web search)
- Tools appear as native agent tools with namespaced names (
server.tool) - Configure in
config.ymlor~/.opencrust/mcp.json(Claude Desktop compatible) - CLI:
opencrust mcp list,opencrust mcp inspect <name>
- On first message, the agent introduces itself and asks a few questions to learn your preferences
- Writes
~/.opencrust/dna.mdwith your name, communication style, guidelines, and the bot's own identity - No config files to edit, no wizard sections to fill out - just a conversation
- Hot-reloads on edit - change
dna.mdand the agent adapts immediately - Migrating from OpenClaw?
opencrust migrate openclawimports your existingSOUL.md
- Tool execution loop - bash, file_read, file_write, web_fetch, web_search, schedule_heartbeat (up to 10 iterations)
- SQLite-backed conversation memory with vector search (sqlite-vec + Cohere embeddings)
- Context window management - rolling conversation summarization at 75% context window
- Scheduled tasks - cron, interval, and one-shot scheduling
- Define agent skills as Markdown files (SKILL.md) with YAML frontmatter
- Auto-discovery from
~/.opencrust/skills/- injected into the system prompt - CLI:
opencrust skill list,opencrust skill install <url>,opencrust skill remove <name>
- Config hot-reload - edit
config.yml, changes apply without restart - Daemonization -
opencrust start --daemonwith PID management - Self-update -
opencrust updatedownloads the latest release with SHA-256 verification,opencrust rollbackto revert - Restart -
opencrust restartgracefully stops and starts the daemon - Runtime provider switching - add or switch LLM providers via the webchat UI or REST API without restarting
- Migration tool -
opencrust migrate openclawimports skills, channels, and credentials - Conversation summarization - rolling summary at 75% context window, session summaries persisted across restarts
- Interactive setup -
opencrust initwizard for provider and channel configuration
One command imports your skills, channel configs, credentials (encrypted into the vault), and personality (SOUL.md as dna.md):
opencrust migrate openclawUse --dry-run to preview changes before committing. Use --source /path/to/openclaw to specify a custom OpenClaw config directory.
OpenCrust looks for config at ~/.opencrust/config.yml:
gateway:
host: "127.0.0.1"
port: 3888
llm:
claude:
provider: anthropic
model: claude-sonnet-4-5-20250929
# api_key resolved from: vault > config > ANTHROPIC_API_KEY env var
ollama-local:
provider: ollama
model: llama3.1
base_url: "http://localhost:11434"
channels:
telegram:
type: telegram
enabled: true
bot_token: "your-bot-token" # or TELEGRAM_BOT_TOKEN env var
agent:
# Personality is configured via ~/.opencrust/dna.md (auto-created on first message)
max_tokens: 4096
max_context_tokens: 100000
memory:
enabled: true
# MCP servers for external tools
mcp:
filesystem:
command: npx
args: ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]See the full configuration reference for all options including Discord, Slack, WhatsApp, iMessage, embeddings, and MCP server setup.
crates/
opencrust-cli/ # CLI, init wizard, daemon management
opencrust-gateway/ # WebSocket gateway, HTTP API, sessions
opencrust-config/ # YAML/TOML loading, hot-reload, MCP config
opencrust-channels/ # Discord, Telegram, Slack, WhatsApp, iMessage
opencrust-agents/ # LLM providers, tools, MCP client, agent runtime
opencrust-db/ # SQLite memory, vector search (sqlite-vec)
opencrust-plugins/ # WASM plugin sandbox (wasmtime)
opencrust-media/ # Media processing (scaffolded)
opencrust-security/ # Credential vault, allowlists, pairing, validation
opencrust-skills/ # SKILL.md parser, scanner, installer
opencrust-common/ # Shared types, errors, utilities
| Component | Status |
|---|---|
| Gateway (WebSocket, HTTP, sessions) | Working |
| Telegram (streaming, commands, pairing, photos, voice, documents) | Working |
| Discord (slash commands, sessions) | Working |
| Slack (Socket Mode, streaming) | Working |
| WhatsApp (webhooks) | Working |
| iMessage (macOS, group chats) | Working |
| LLM providers (14: Anthropic, OpenAI, Ollama + 11 OpenAI-compatible) | Working |
| Agent tools (bash, file_read, file_write, web_fetch, web_search, schedule_heartbeat) | Working |
| MCP client (stdio, tool bridging) | Working |
| Skills (SKILL.md, auto-discovery) | Working |
| Config (YAML/TOML, hot-reload) | Working |
| Personality (DNA bootstrap, hot-reload) | Working |
| Memory (SQLite, vector search, summarization) | Working |
| Security (vault, allowlist, pairing) | Working |
| Scheduling (cron, interval, one-shot) | Working |
| CLI (init, start/stop/restart, update, migrate, mcp, skills) | Working |
| Plugin system (WASM sandbox) | Scaffolded |
| Media processing | Scaffolded |
OpenCrust is open source under the MIT license. Join the Discord to chat with contributors, ask questions, or share what you're building. See CONTRIBUTING.md for setup instructions, code guidelines, and the crate overview.
| Priority | Issue | Description |
|---|---|---|
| P0 | #103 | README and positioning |
| P0 | #104 | Website: opencrust.org |
| P0 | #105 | Discord community |
| P1 | #106 | Built-in starter skills |
| P1 | #107 | Scheduling hardening |
| P1 | #108 | Multi-agent routing |
| P1 | #109 | Install script |
| P1 | #110 | Linux aarch64 + Windows releases |
| P1 | #80 | MCP: HTTP transport, resources, prompts |
Browse all open issues or filter by good-first-issue to find a place to start.
MIT
