Enterprise-grade Kafka management with rich metadata, policy enforcement, and batch operations
"Without knowing who owns a topic and what it's used for, Kafka is just a message queue."
🚀 Quick Start • ✨ Features • 📖 Documentation • 🗺️ Roadmap
If you're new to Kafka-Gov, we recommend the following onboarding path.
1. **Choose your mode**
   - Just want to try the UI and features quickly → Lite Mode (SQLite)
   - Running a team PoC or a more production-like environment → Full Stack Mode (Docker + MySQL)
2. **Prepare your environment**
   - Lite Mode:
     - Install Python 3.12+ and uv
     - If you do not set any DB-related values in `.env`, Kafka-Gov will automatically use SQLite
   - Full Stack Mode:
     - Install Docker / Docker Compose
     - Optionally adjust Kafka/Schema Registry settings in `.env` for your environment
3. **Configure the metadata database**
   - Default: when nothing is configured, `sqlite+aiosqlite:///./kafka_gov.db` is used
   - MySQL example: `KAFKA_GOV_DATABASE_URL=mysql+aiomysql://user:password@mysql:3306/kafka_gov?charset=utf8mb4`
   - PostgreSQL example: `KAFKA_GOV_DATABASE_URL=postgresql+asyncpg://user:password@postgres:5432/kafka_gov`
4. **Run migrations**
   - Alembic always uses `settings.database.url`, so as long as the URL is correct, migrations target the right DB.
   - Local (Lite Mode), recommended: `bash script/migrate.sh` (or `./script/migrate.sh` if executable)
   - Advanced (run Alembic directly): `uv run alembic upgrade head`
   - In Docker environments, the `migration` service included in `docker-compose.yml` is responsible for running migrations.
5. **Open the UI and register your first connections**
   - Open `http://localhost:8000` in your browser
   - Register Kafka Cluster / Schema Registry connections directly through the UI
   - From then on, all governance metadata is stored in the selected DB (SQLite/MySQL/Postgres)
After onboarding, see Quick Start and Configuration for more details.
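For reference, the SQLite fallback described in step 3 can be pictured as a settings field with a default. This is an illustrative sketch only, assuming a simplified pydantic-settings shape; Kafka-Gov's actual settings module may be structured differently:

```python
# Illustrative sketch, not Kafka-Gov's actual settings code: shows how
# KAFKA_GOV_DATABASE_URL can fall back to a local SQLite file when unset.
from pydantic import Field
from pydantic_settings import BaseSettings


class DatabaseSettings(BaseSettings):
    # The env var wins when set; otherwise the local SQLite file is used.
    url: str = Field(
        default="sqlite+aiosqlite:///./kafka_gov.db",
        alias="KAFKA_GOV_DATABASE_URL",
    )


settings = DatabaseSettings()
print(settings.url)  # e.g. mysql+aiomysql://... when the env var is set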
Kafka-Gov transforms Kafka from a simple message broker into a governed enterprise platform with:
- 🏷️ Rich Metadata: Owner, team, tags, documentation links for every topic
- 🛡️ Policy Enforcement: Environment-specific rules (naming, replication, ISR)
- 🔄 Batch Operations: YAML-based bulk create/update/delete with dry-run
- 📦 Schema Management: Integrated Schema Registry with auto-correlation
- 📊 Real-time Monitoring: Consumer lag, fairness index, stuck partition detection
- 🔍 Complete Audit Trail: Track every change (who, when, what, why)
| Traditional Tools | Kafka-Gov |
|---|---|
| ❌ No ownership tracking | ✅ Mandatory owner, team, tags |
| ❌ No policy enforcement | ✅ Environment-specific validation |
| ❌ Manual one-by-one operations | ✅ YAML-based batch operations |
| ❌ No audit trail | ✅ Complete change history |
| ❌ Separate schema tool | ✅ Integrated schema management |
Problems we solve:
- 🤔 Who owns this topic? → Track ownership across hundreds of topics
- 📄 What is it for? → Required documentation links
- ⚠️ Policy violations? → Auto-detect risky configs before deployment
- 🔄 Bulk operations? → Create 50+ topics in one YAML file
- 📜 Change history? → Complete audit trail with before/after snapshots
Although we initially approached this project from a governance perspective, over time it started to drift toward operational concerns. To realign with its original direction (governance), we are refocusing on topics and scenario-based policy alerts as the core of the project.
Kafka-Gov supports Airflow-style metadata DB switching.
For local development or quick evaluation, Kafka-Gov uses a SQLite file as the metadata store.
```bash
# 1. Clone and setup
git clone https://github.com/limhaneul12/kafka-gov.git
cd kafka-gov
cp .env.example .env

# 2. (optional) If you do not set any DB env vars, SQLite is used by default
#    When KAFKA_GOV_DATABASE_URL is unset, ./kafka_gov.db is created/used automatically

# 3. Install dependencies
uv sync

# 4. Run DB migrations (uses settings.database.url → default SQLite)
bash script/migrate.sh

# 5. Start backend API
uv run uvicorn app.main:app --reload

# 6. (optional) Start frontend (from ./frontend)
# pnpm install
# pnpm dev
```

In this mode, the local file `./kafka_gov.db` is used as the metadata database.
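Once uvicorn is running, you can smoke-test the backend from Python. FastAPI serves its interactive docs at `/docs` by default; whether Kafka-Gov keeps that default path is an assumption here:

```python
# Smoke test for the local backend; /docs is FastAPI's default docs path
# (assumed unchanged in Kafka-Gov).
import urllib.request

with urllib.request.urlopen("http://localhost:8000/docs") as resp:
    print(resp.status)  # 200 when the API is up
```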
For production-like setups, use Docker Compose to start MySQL/Kafka/Schema Registry/Redis together.
```bash
# 1. Clone and setup
git clone https://github.com/limhaneul12/kafka-gov.git
cd kafka-gov
cp .env.example .env

# 2. Start all services (includes MySQL-backed metadata DB)
docker-compose up -d

# 3. Access web UI (proxied by nginx)
open http://localhost:8000
```

That's it! 🎉
See Quick Start Guide for more details.
Every topic includes owner, team, documentation URL, and custom tags:
```yaml
name: prod.orders.created
metadata:
  owner: team-commerce
  doc: "https://wiki.company.com/orders"
  tags: ["orders", "critical", "pii"]
```

Create dozens of topics at once:
```yaml
kind: TopicBatch
env: prod
items:
  - name: prod.orders.created
    action: create
    config:
      partitions: 12
      replication_factor: 3
```

Upload → Review dry-run → Apply changes
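Conceptually, the dry-run step is a diff between the desired state declared in YAML and the current cluster state. A minimal sketch of that idea follows; the names and data shapes are illustrative, not Kafka-Gov's actual API:

```python
# Illustrative dry-run diff: desired state (from YAML) vs. current cluster state.
current = {"prod.orders.created": {"partitions": 6, "replication_factor": 3}}
desired = {
    "prod.orders.created": {"partitions": 12, "replication_factor": 3},
    "prod.orders.shipped": {"partitions": 12, "replication_factor": 3},
}

for name, cfg in desired.items():
    if name not in current:
        print(f"CREATE {name}: {cfg}")
    elif current[name] != cfg:
        print(f"ALTER  {name}: {current[name]} -> {cfg}")
    else:
        print(f"NOOP   {name}")
```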
Environment-specific rules prevent production incidents:
| Policy | DEV | PROD |
|---|---|---|
| Min Replication | ≥ 1 | ≥ 3 |
| Min ISR | ≥ 1 | ≥ 2 |
| 'tmp' prefix | ✅ | 🚫 |
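A minimal sketch of what these environment-specific checks amount to, with rule values taken from the table above (the function and dictionary names are illustrative, not Kafka-Gov's internals):

```python
# Illustrative policy check; thresholds mirror the DEV/PROD table above.
MIN_REPLICATION = {"dev": 1, "prod": 3}
MIN_ISR = {"dev": 1, "prod": 2}


def validate_topic(env: str, name: str, replication_factor: int, min_isr: int) -> list[str]:
    violations: list[str] = []
    if replication_factor < MIN_REPLICATION[env]:
        violations.append(f"replication_factor {replication_factor} < {MIN_REPLICATION[env]} (min for {env})")
    if min_isr < MIN_ISR[env]:
        violations.append(f"min.insync.replicas {min_isr} < {MIN_ISR[env]} (min for {env})")
    if env == "prod" and name.startswith("tmp"):
        violations.append("'tmp' prefix is not allowed in prod")
    return violations


print(validate_topic("prod", "tmp.orders", replication_factor=2, min_isr=1))
# flags all three violations before anything touches the cluster
```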
- Consumer lag tracking with p50/p95/max metrics
- Fairness index (Gini coefficient) for partition distribution
- Stuck partition detection with configurable thresholds
- Rebalance stability scoring with time windows
- WebSocket streaming for live updates
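The fairness index above is a Gini coefficient over per-partition load. Here is a self-contained sketch of the metric itself (not Kafka-Gov's implementation):

```python
# Gini coefficient over per-partition message counts:
# 0.0 = perfectly even distribution, values near 1.0 = highly skewed.
def gini(counts: list[int]) -> float:
    n = len(counts)
    if n == 0 or sum(counts) == 0:
        return 0.0
    xs = sorted(counts)
    total = sum(xs)
    # Standard formula: G = (2 * sum(i * x_i)) / (n * total) - (n + 1) / n
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n


print(gini([100, 100, 100, 100]))  # 0.0  — perfectly balanced partitions
print(gini([400, 0, 0, 0]))        # 0.75 — one hot partition
```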
- 📋 Topic Management
- 🔄 Batch Operations
- 🛡️ Policy Enforcement
- 📦 Schema Registry
- 📊 Real-time Monitoring
- 📖 All Features
Backend: Python 3.12+ • FastAPI • Pydantic v2 • SQLAlchemy 2.0 • Confluent Kafka
Frontend: React 19 • TypeScript • TailwindCSS • Rolldown
Infrastructure: SQLite (Lite Mode) • MySQL (Production) • Kafka • Schema Registry • MinIO
v1.0 (Current):
- ✅ Core governance features
- ✅ Real-time monitoring
- ✅ Policy enforcement
v1.1 (In Progress):
- 🔄 Enhanced frontend filters
- 🔄 Preset management UI
v2.0 (Planned):
- 📅 RBAC & multi-tenancy
- 📅 Prometheus/Grafana integration
- 📅 GitOps integration
Contributions welcome! Please read our Contributing Guide before submitting PRs.
```bash
# Setup development environment
uv sync
uv run pytest --cov=app

# Code standards
uv run ruff check app/
uv run ruff format app/
```

MIT License - see LICENSE for details.
Make Kafka safer and more efficient 🚀

Made with ❤️ by developers, for developers

⭐ Star if you find this useful! ⭐


