Moirai is a GenAI-native press review platform designed for the Model Context Protocol (MCP) ecosystem. It allows AI agents to aggregate RSS feeds, synthesize them into "Events," and track long-term "Trends" across isolated userspaces.
Moirai acts as a synthesis engine that LLM agents use to analyze news. It consists of:
- MCP Server (Port 8090): The primary interface for AI agents. Provides tools to fetch news, create events, and track trends via Server-Sent Events (SSE).
- Admin UI (Port 8088): A Vue.js dashboard for humans to review the agent's work, manage feeds, and visualize the data.
- REST API (Port 8088): The backend for the UI and external integrations.
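The Article → Event → Trend pipeline these components serve can be sketched with hypothetical dataclasses (the names and fields below are illustrative only, not Moirai's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Article:
    """A single item pulled from an RSS feed."""
    feed_url: str
    title: str
    url: str

@dataclass
class Event:
    """An agent-synthesized story built from one or more articles."""
    title: str
    summary: str
    articles: list[Article] = field(default_factory=list)

@dataclass
class Trend:
    """A long-running theme grouping related events over time."""
    name: str
    events: list[Event] = field(default_factory=list)

# Toy example: one article rolled up into an event, tracked by a trend.
article = Article("https://example.com/rss", "Big launch", "https://example.com/a1")
event = Event("Product launch", "A major launch was announced.", [article])
trend = Trend("AI product launches", [event])
print(len(trend.events[0].articles))  # number of source articles in the event
```

The real synthesis happens inside the agent via MCP tool calls; this sketch only shows how the three concepts nest.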
Architecture Details: See docs/ARCHITECTURE.md.
The easiest way to run the full stack (UI, API, Worker, MCP Server, Database):
```bash
docker compose up --build
```

This starts the following services:
- `api` — Flask REST API (stateless, scalable web server)
- `worker` — Enrichment worker (long-running CouchDB changes-feed listener, single instance)
- `ui` — Vue.js frontend (served by Nginx)
- `nginx` — Reverse proxy routing traffic to `api` and `ui`
- `mcp-server` — FastMCP server for agent tool calls
- `couchdb` — Database
- `redis` — Cache
Endpoints:
- Admin UI: http://localhost:8088
- MCP Server: http://localhost:8090/sse
- API: http://localhost:8088/api
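When scripting against these services, a tiny helper like the following (hypothetical, not part of Moirai) keeps the hosts and ports straight:

```python
# Hypothetical convenience helper for building Moirai endpoint URLs.
# The ports match the docker-compose defaults documented above.
def moirai_endpoints(host: str = "localhost") -> dict[str, str]:
    return {
        "ui": f"http://{host}:8088",
        "api": f"http://{host}:8088/api",
        "mcp_sse": f"http://{host}:8090/sse",
    }

print(moirai_endpoints()["mcp_sse"])  # http://localhost:8090/sse
```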
To run the feed scheduler manually (one cycle, then exit):
```bash
docker compose run --rm api python run_scheduler.py
```

The UI includes a Chat interface to interact with the Agent.
- Go to Chat in the UI.
- Open Settings.
- Configure your LLM endpoint (e.g., Ollama at `http://host.docker.internal:11434/v1`) and model (e.g., `llama3.1`).
- Add Feeds: Ask the agent to "Add the RSS feed for Hacker News".
- Synthesize: Ask the agent to "Check recent articles and create events for major stories".
- Review: Use the dashboard to see the Events and Trends created by the agent.
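The synthesize step above can be sketched as a toy grouping function (purely illustrative — in Moirai the LLM agent makes this judgment via MCP tools, not a keyword heuristic):

```python
# Toy sketch of the "synthesize" step: group articles into events.
# Illustrative only -- the real grouping is done by the LLM agent.
from collections import defaultdict

def synthesize_events(articles: list[dict]) -> dict[str, list[dict]]:
    events = defaultdict(list)
    for article in articles:
        # Naive heuristic: key each article by the first word of its title.
        key = article["title"].split()[0].lower()
        events[key].append(article)
    return dict(events)

articles = [
    {"title": "OpenAI ships new model", "url": "https://example.com/1"},
    {"title": "OpenAI updates pricing", "url": "https://example.com/2"},
    {"title": "Rust 2.0 rumors", "url": "https://example.com/3"},
]
events = synthesize_events(articles)
print(sorted(events))  # ['openai', 'rust']
```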
```
.
├── api/               # Flask REST API (Port 8088)
├── ui/                # Vue.js Frontend
├── mcp_server.py      # FastMCP Server (Port 8090)
├── tasks/             # Background tasks and helpers
├── tests/             # Unit and integration tests
├── helm/              # Kubernetes charts
└── docker-compose.yml
```
If you want to run components individually without Docker:
- Install Miniforge (if needed):

  ```bash
  # Linux x86_64
  curl -L -o /tmp/Miniforge3.sh https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-Linux-x86_64.sh
  bash /tmp/Miniforge3.sh
  ```

- Create/use a Miniforge environment:

  ```bash
  mamba env create -f environment.yml
  mamba activate moirai
  ```

- Database: ensure CouchDB is running (e.g., via `docker compose up couchdb`).

- API:

  ```bash
  mamba run -n moirai python main.py
  ```

- MCP Server:

  ```bash
  mamba run -n moirai python mcp_server.py
  ```

- UI:

  ```bash
  cd ui
  yarn install
  yarn dev
  ```
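Before starting components manually, a quick stdlib pre-flight check (a hypothetical convenience script, not shipped with Moirai) can confirm the required CLIs are on your `PATH`:

```python
# Hypothetical pre-flight check for the manual (non-Docker) setup.
import shutil

REQUIRED = ["mamba", "yarn", "docker"]  # docker is only needed for CouchDB

def missing_tools(required: list[str]) -> list[str]:
    """Return the subset of required tools not found on PATH."""
    return [tool for tool in required if shutil.which(tool) is None]

if __name__ == "__main__":
    missing = missing_tools(REQUIRED)
    if missing:
        print("Missing tools:", ", ".join(missing))
    else:
        print("All required tools found.")
```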
Before committing and pushing changes, run relevant local tests for the area you touched (API, UI, MCP, or integration). If local tests cannot be run, explicitly document why (and what you did instead) in the commit message body or pull request description.
- Architecture
- LLM Configuration - How to configure Ollama, OpenAI, and Gemini.
- External REST API - Documentation for `/mcp` endpoints on the REST API.
- Release Process - Release and RC checklist.
- Example Usage
MIT License