
Tandem

Tandem is an engine-owned workflow runtime for coordinated autonomous work: local-first, secure by design, cross-platform, and built for supervised automation.

While the current landscape of AI agents is flooded with chat-first assistants, conversational routing breaks down at scale: context bloats, and agents stay blind to each other's concurrent work. Chat is fine as an interface, but it is weak as an authoritative coordination substrate for parallel, durable engineering workflows.

Tandem takes a fundamentally different approach to the realities of agentic engineering: it treats autonomous execution as a distributed-systems problem, prioritizing robust engine state over fragile chat transcripts.

The engine provides durable coordination primitives (blackboards, workboards, explicit task claiming, operational memory accumulation, and checkpoints) that let multiple agents work concurrently on complex, long-running software engineering and automation tasks without colliding.

  • Multiple clients, one engine: The desktop app, TUI, and headless APIs all operate on the exact same materialized state truth.
  • Engine-owned orchestration: Shared task state, replay, approvals, and deterministic workflow projections natively solve coordination failures.
  • Provider agnostic: Use OpenRouter, Anthropic, OpenAI, OpenCode Zen, or local Ollama endpoints.

Durable State → Workboards → Agent Swarm → Artifacts

Download desktop app · Deploy on a VPS (5 min) · Read the docs


30-second quickstart

Desktop

  1. Download and launch Tandem: tandem.frumu.ai
  2. Open Settings and add a provider API key.
  3. Select a workspace folder.
  4. Start with a task prompt and choose Immediate or Plan Mode.

Headless (server/VPS)

Option 1: Quick start (run instantly). Run the pre-built control panel directly; it automatically downloads and starts the engine:

npx @frumu/tandem-panel

Option 2: Hackable / service install (modifiable source). Clone the repo to get the control panel source code, modify it as needed, and install it as a background systemd service:

git clone https://github.com/frumu-ai/tandem.git
cd tandem/examples/agent-quickstart
sudo bash setup-agent.sh

Open the printed URL and sign in with the generated key!

Architecture

graph TD
    %% Clients
    Desktop[Desktop App]
    ControlPanel[Web Control Panel]
    TUI[Terminal UI]
    API[SDKs & API Clients]

    subgraph "Tandem Engine (Source of Truth)"
        Orchestrator[Orchestration & Approvals]
        Blackboard[(Blackboard & Shared State)]
        Memory[(Vector Memory & Checkpoints)]
        Worktrees[Git Worktree Isolation]
    end

    subgraph "Agent Swarm"
        Planner[Planner Agent]
        Builder[Builder Agent]
        Validator[Verifier Agent]
    end

    Desktop -.-> Orchestrator
    ControlPanel -.-> Orchestrator
    TUI -.-> Orchestrator
    API -.-> Orchestrator

    Orchestrator --> Blackboard
    Orchestrator --> Memory
    Orchestrator --> Worktrees

    Blackboard <--> Planner
    Blackboard <--> Builder
    Blackboard <--> Validator

Common workflows

| Task | What Tandem does |
| --- | --- |
| Refactor a codebase safely | Scans files, proposes a staged plan, shows diffs, and applies approved changes |
| Research and summarize sources | Reads multiple references and outputs structured summaries |
| Generate recurring reports | Runs scheduled automations and produces markdown/dashboard artifacts |
| Connect external tools through MCP | Uses configured MCP connectors with approval-aware execution |
| Operate AI workflows via API | Runs sessions through local/headless HTTP + SSE endpoints |

Features

Engine-Owned Workflow Runtime

  • Coordinated autonomous work: Explicit blackboards over conversational thread dumping.
  • Concurrent agents: Manage parallel execution through Git worktree isolation and patch streams.
  • State survival: Checkpoints, replayable event history, and materialized run states.
  • Approval gates: Keep humans in control with supervised tool flows for destructive actions.
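The checkpoint and replay bullets above amount to folding an append-only event log into a materialized state. The sketch below is a conceptual illustration only: the Event shape, the event kind names, and the materialize function are assumptions for exposition, not Tandem's actual event schema.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    seq: int           # monotonically increasing sequence number
    kind: str          # hypothetical kinds: "task.created", "task.claimed", "task.done"
    task_id: str
    payload: dict = field(default_factory=dict)

def materialize(events):
    """Fold an event log into current run state.

    Replaying the same log always yields the same state, which is what
    makes checkpointing and crash recovery deterministic.
    """
    state = {}
    for ev in sorted(events, key=lambda e: e.seq):
        task = state.setdefault(ev.task_id, {"status": "pending", "owner": None})
        if ev.kind == "task.claimed":
            task["status"] = "in_progress"
            task["owner"] = ev.payload.get("agent")
        elif ev.kind == "task.done":
            task["status"] = "done"
    return state

log = [
    Event(1, "task.created", "t1"),
    Event(2, "task.claimed", "t1", {"agent": "builder"}),
    Event(3, "task.done", "t1"),
]
print(materialize(log)["t1"]["status"])  # done
```

Because the fold sorts by sequence number, replaying a shuffled or partially delivered log converges to the same materialized state.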

Multi-Agent Orchestration

  • Kanban-driven execution: Agents claim tasks, report blockers, and hand off work through deterministic state.
  • Memory-aware swarms: Agents learn from prior runs, extracting fixes and failure patterns automatically.
  • Revisioned coordination: Engine-enforced locks prevent agents from trampling the same codebase simultaneously.
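The claiming and locking bullets above can be reduced to one invariant: a task claim must be atomic, so at most one agent ever owns a task. This is a conceptual illustration of that invariant, not the Tandem API; the Workboard class and claim method are hypothetical names.

```python
import threading

class Workboard:
    def __init__(self, tasks):
        self._lock = threading.Lock()
        self._owners = {t: None for t in tasks}

    def claim(self, task, agent):
        """Atomically claim a task; returns False if it is already owned."""
        with self._lock:
            if self._owners.get(task) is not None:
                return False
            self._owners[task] = agent
            return True

board = Workboard(["refactor-auth"])
print(board.claim("refactor-auth", "builder-1"))  # True: first claim wins
print(board.claim("refactor-auth", "builder-2"))  # False: already claimed
```

In Tandem the equivalent check lives in the engine, so the guarantee holds across processes and machines, not just threads.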

Integrations and automation

  • MCP tool connectors
  • Scheduled automations and routines
  • Headless runtime with HTTP + SSE APIs
  • Desktop runtime for Windows, macOS, and Linux

Security and local-first controls

  • API keys encrypted in local SecureKeyStore (AES-256-GCM)
  • Workspace access is scoped to folders you explicitly grant
  • Write/delete operations require approval via supervised tool flow
  • Sensitive paths denied by default (.env, .ssh/*, *.pem, *.key, secrets folders)
  • No analytics or call-home telemetry from Tandem itself
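The deny-by-default behavior described above can be sketched as a glob filter over workspace-relative paths. This is an illustrative sketch only; the pattern list mirrors the bullets above, but the is_denied function and its matching rules are assumptions, not Tandem's implementation.

```python
import fnmatch
import posixpath

# Patterns taken from the defaults listed above; treated here as globs.
DENY_PATTERNS = [".env", ".ssh/*", "*.pem", "*.key", "secrets/*"]

def is_denied(path):
    """Return True if a workspace-relative path matches a sensitive pattern."""
    norm = posixpath.normpath(path)
    basename = norm.split("/")[-1]
    return any(
        fnmatch.fnmatch(norm, pat) or fnmatch.fnmatch(basename, pat)
        for pat in DENY_PATTERNS
    )

print(is_denied(".env"))         # True
print(is_denied(".ssh/id_rsa"))  # True
print(is_denied("src/main.rs"))  # False
```

Checking both the full path and the basename means a sensitive file is still denied when it sits inside a nested directory.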

Outputs and artifacts

  • Markdown reports
  • HTML dashboards
  • PowerPoint (.pptx) generation

Programmatic API

The SDKs are API clients. They do not bundle tandem-engine.
You need a running Tandem runtime (desktop sidecar or headless engine) and then use the SDKs to create sessions, trigger runs, and stream events.

Runtime options:

  • Desktop app running locally (starts the sidecar runtime)

  • Headless engine via npm:

    npm install -g @frumu/tandem
    tandem-engine serve --hostname 127.0.0.1 --port 39731
  • TypeScript SDK: @frumu/tandem-client

  • Python SDK: tandem-client

  • Engine package: @frumu/tandem

// npm install @frumu/tandem-client
import { TandemClient } from "@frumu/tandem-client";

const client = new TandemClient({ baseUrl: "http://localhost:39731", token: "..." });
const sessionId = await client.sessions.create({ title: "My agent" });
const { runId } = await client.sessions.promptAsync(sessionId, "Summarize README.md");

for await (const event of client.stream(sessionId, runId)) {
  if (event.type === "session.response") process.stdout.write(event.properties.delta ?? "");
}

# pip install tandem-client
from tandem_client import TandemClient

async with TandemClient(base_url="http://localhost:39731", token="...") as client:
    session_id = await client.sessions.create(title="My agent")
    run = await client.sessions.prompt_async(session_id, "Summarize README.md")
    async for event in client.stream(session_id, run.run_id):
        if event.type == "session.response":
            print(event.properties.get("delta", ""), end="", flush=True)

Provider setup

Configure providers in Settings.

| Provider | Description | Get API key |
| --- | --- | --- |
| OpenRouter | Access many models through one API | openrouter.ai/keys |
| OpenCode Zen | Fast, cost-effective models optimized for coding | opencode.ai/zen |
| Anthropic | Anthropic models (Sonnet, Opus, Haiku) | console.anthropic.com |
| OpenAI | GPT models and OpenAI endpoints | platform.openai.com |
| Ollama | Local models (no remote API key required) | Setup Guide |
| Custom | OpenAI-compatible API endpoint | Configure endpoint URL |

Design principles

  • Local-first runtime: Data and state stay on your machine unless you send prompts/tools to configured providers.
  • Supervised execution: AI runs through controlled tools with explicit approvals for write/delete operations.
  • Provider agnostic: Route through the model providers you choose.
  • Open source and auditable: MIT repo license and MIT OR Apache-2.0 for Rust crates.

Security and privacy

  • Telemetry: Tandem does not include analytics/tracking or call-home telemetry.
  • Provider traffic: AI request content is sent only to endpoints you configure (cloud providers or local Ollama/custom endpoints).
  • Network scope: Desktop runtime communicates with the local sidecar (127.0.0.1) and configured endpoints.
  • Updater/release checks: App update and release metadata flows can contact GitHub endpoints.
  • Credential storage: Provider keys are stored encrypted (AES-256-GCM).
  • Filesystem safety: Access is scoped to granted folders; sensitive paths are denied by default.

For the full threat model and reporting process, see SECURITY.md.

Learn more

Advanced MCP behavior (including OAuth/auth-required flows and retries) is documented in docs/ENGINE_CLI.md.

Advanced setup (build from source)

Prerequisites

| Platform | Additional requirements |
| --- | --- |
| Windows | Build Tools for Visual Studio |
| macOS | Xcode Command Line Tools: xcode-select --install |
| Linux | libwebkit2gtk-4.1-dev, libappindicator3-dev, librsvg2-dev, build-essential, pkg-config |

Local development

git clone https://github.com/frumu-ai/tandem.git
cd tandem
pnpm install
cargo build -p tandem-ai
pnpm tauri dev

Production build and signing notes

pnpm tauri build

For local self-built updater artifacts, generate your own signing keys and configure:

  1. Generate a keypair: pnpm tauri signer generate -w ./src-tauri/tandem.key
  2. Set TAURI_SIGNING_PRIVATE_KEY
  3. Set TAURI_SIGNING_PASSWORD
  4. Set the pubkey in src-tauri/tauri.conf.json

Reference: Tauri signing documentation

Output paths:

# Windows: src-tauri/target/release/bundle/msi/
# macOS:   src-tauri/target/release/bundle/dmg/
# Linux:   src-tauri/target/release/bundle/appimage/

macOS install troubleshooting

If a downloaded .dmg shows "damaged" or "corrupted", Gatekeeper is usually rejecting an app bundle/DMG that is not Developer ID signed and notarized.

  1. Confirm the correct architecture (aarch64/arm64 vs x86_64/x64).
  2. Try opening via Finder (Right click -> Open or System Settings -> Privacy & Security -> Open Anyway).
  3. For non-technical distribution, ship signed + notarized artifacts from release automation.

Contributing

Contributions are welcome. See CONTRIBUTING.md.

# Run lints
pnpm lint

# Run tests
pnpm test
cargo test

# Format code
pnpm format
cargo fmt

Engine-specific build/run/smoke instructions: docs/ENGINE_TESTING.md
Engine CLI usage reference: docs/ENGINE_CLI.md
Engine runtime communication contract: docs/ENGINE_COMMUNICATION.md

Maintainer release note

  • Desktop binary/app release: .github/workflows/release.yml (tag pattern v*)
  • Registry publish (crates.io + npm wrappers): .github/workflows/publish-registries.yml (manual trigger or publish-v*)
  • The workflows are intentionally separate

Project structure

tandem/
├── src/                    # React frontend
│   ├── components/         # UI components
│   ├── hooks/              # React hooks
│   └── lib/                # Utilities
├── src-tauri/              # Rust backend
│   ├── src/                # Rust source
│   ├── capabilities/       # Permission config
│   └── binaries/           # Sidecar (gitignored)
├── scripts/                # Build scripts
└── docs/                   # Documentation

Roadmap

  • Phase 1: Security Foundation - Encrypted vault, permission system
  • Phase 2: Sidecar Integration - Tandem agent runtime
  • Phase 3: Glass UI - Modern, polished interface
  • Phase 4: Provider Routing - Multi-provider support
  • Phase 5: Agent Capabilities - Multi-mode agents, execution planning
  • Phase 6: Project Management - Multi-workspace support
  • Phase 7: Advanced Presentations - PPTX export engine, theme mapping, explicit positioning
  • Phase 8: Brand Evolution - Rubik 900 typography, polished boot sequence
  • Phase 9: Memory & Context - Vector database integration (sqlite-vec)
  • Phase 10: Skills System - Importable agent skills and custom instructions
  • Phase 11: Browser Integration - Web content access
  • Phase 12: Team Features - Collaboration tools
  • Phase 13: Mobile Companion - iOS/Android apps

Support this project

If Tandem saves you time, consider sponsoring development.

❤️ Become a Sponsor

Star history

Star History Chart

License

MIT for the repository, with the Rust crates dual-licensed MIT OR Apache-2.0 (see LICENSE and LICENSE-APACHE).

Acknowledgments

  • Anthropic for the Cowork inspiration
  • Tauri for the secure desktop framework
  • The open source community
