Data Nadhi is an open-source platform that helps you manage the flow of data starting from your application logs all the way to your desired destinations — databases, APIs, or alerting systems.
Direct. Transform. Deliver.
Flow your logs, trigger your pipelines.
Data Nadhi provides a unified platform to ingest, transform, and deliver data — powered by Temporal, MongoDB, Redis, and MinIO.
It connects easily to your applications through the Data Nadhi SDK and gives you full control over how data moves across your system (a short usage sketch follows the list below).
- Direct – Collect logs and data from your applications or external sources.
- Transform – Use Temporal workflows to apply filters, enrichments, or custom transformations.
- Deliver – Send the final processed data to any configured destination — all handled reliably and asynchronously.
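To give a sense of what integration looks like, here is a minimal sketch of sending a log record through the SDK. The module name (`data_nadhi_sdk`), the `DataNadhiLogger` class, and its parameters are illustrative assumptions, not the SDK's confirmed interface; check the data-nadhi-sdk documentation for the real API.

```python
# Minimal sketch of sending a log through the Data Nadhi SDK.
# NOTE: the module, class, and parameter names below are
# illustrative assumptions; see the data-nadhi-sdk docs for
# the actual interface.
from data_nadhi_sdk import DataNadhiLogger  # hypothetical import

# Point the logger at a running data-nadhi-server instance.
logger = DataNadhiLogger(
    endpoint="http://localhost:8080",  # assumed server address
    api_key="your-api-key",            # assumed auth mechanism
)

# Direct: the record is collected and handed to data-nadhi-server,
# which passes it to Temporal for transformation and delivery.
logger.info("order.created", extra={"order_id": 42, "amount": 99.9})
```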
Data Nadhi is designed to be modular, developer-friendly, and ready for production.
The platform is built from multiple services and tools working together:
| Component | Description |
|---|---|
| data-nadhi-server | Handles incoming requests from the SDK and passes them to Temporal. |
| data-nadhi-internal-server | Internal service for managing entities, pipelines, and configurations. |
| data-nadhi-temporal-worker | Executes workflow logic and handles transformations and delivery. |
| data-nadhi-sdk | Python SDK for logging and sending data from applications. |
| data-nadhi-dev | Local environment setup using Docker Compose for databases and Temporal. |
| data-nadhi-documentation | Documentation site built with Docusaurus (you’re here now). |
All components are connected through a shared Docker network, making local setup and development simple.
- 🧩 Unified Pipeline – Move data seamlessly from logs to destinations
- ⚙️ Custom Transformations – Define your own transformations using Temporal (see the sketch after this list)
- 🔄 Reliable Delivery – Retries, fault tolerance, and monitoring built in
- 🧠 Easy Integration – Simple SDK-based setup for applications
- 💡 Developer Focused – Dev containers and Docker-first setup for consistency
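To make the custom-transformation idea concrete, here is a generic filter-enrich-deliver pipeline written as a Temporal workflow with the open-source `temporalio` Python package. This is a sketch of the pattern the data-nadhi-temporal-worker follows, not Data Nadhi's actual worker code; the activity and workflow names are invented for illustration.

```python
# Generic sketch of an enrich-then-deliver pipeline as a Temporal
# workflow, using the temporalio Python SDK. Names are illustrative;
# Data Nadhi's real worker code will differ.
from datetime import timedelta
from temporalio import activity, workflow


@activity.defn
async def enrich(record: dict) -> dict:
    # Example enrichment: tag the record with a pipeline marker.
    return {**record, "enriched": True}


@activity.defn
async def deliver(record: dict) -> None:
    # In a real pipeline this would write to a database, API,
    # or alerting system (the configured destination).
    print(f"delivering {record}")


@workflow.defn
class TransformAndDeliver:
    @workflow.run
    async def run(self, record: dict) -> None:
        # Transform: run the enrichment activity with a timeout;
        # Temporal retries failed activities automatically.
        enriched = await workflow.execute_activity(
            enrich, record, start_to_close_timeout=timedelta(seconds=30)
        )
        # Deliver: send the processed record onward.
        await workflow.execute_activity(
            deliver, enriched, start_to_close_timeout=timedelta(seconds=30)
        )
```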
This repository contains the local development setup for the entire Data Nadhi platform.
It uses Docker Compose to spin up all required infrastructure components, so that all services can connect to the same network and be used seamlessly inside VS Code Dev Containers.
- MongoDB – Primary datastore for pipeline and entity configurations
- Redis – Used for caching and quick lookups
- MinIO – S3-compatible object storage for storing logs and temporary files (example client usage after this list)
- Temporal – Workflow orchestration engine to run the data pipelines
- Docker Network (datanadhi-net) – Shared network for connecting all services locally
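Because MinIO is S3-compatible, any S3 client can talk to it. As a quick illustration, here is how the failure-logs bucket created in the setup steps below could be checked from Python with the `minio` package. The host and credentials mirror the defaults used in the setup steps; adjust them to match your environment.

```python
# Sketch: verify the local MinIO instance and the failure-logs
# bucket with the minio Python client. Host and credentials
# mirror the dev defaults from the setup steps below.
from minio import Minio

client = Minio(
    "localhost:9000",       # MinIO's default S3 port
    access_key="minio",     # default dev credentials from the
    secret_key="minio123",  # setup steps; change in production
    secure=False,           # local dev runs without TLS
)

# Create the bucket if the mc step has not been run yet.
if not client.bucket_exists("failure-logs"):
    client.make_bucket("failure-logs")
print(client.list_buckets())
```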
- Docker & Docker Compose
- VS Code (with Dev Containers extension)
- Clone the repository
git clone https://github.com/Data-ARENA-Space/data-nadhi-dev.git
cd data-nadhi-dev
- Create Docker Network
docker network create datanadhi-net
- Start Services
docker compose up -d
- Check Running Containers
docker ps
- Set up MinIO Bucket
docker exec -it datanadhi-minio /bin/bash
mc alias set local http://localhost:9000 minio minio123
mc mb local/failure-logs
- Open the repository in a Dev Container and run the Mongo migrations
npm run mongo:migrate:up
All your core data stores and the Temporal setup are now running inside Docker, ready to be connected to the rest of the Data Nadhi services.
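To sanity-check the stack from code, a short Python snippet can ping MongoDB and Redis. This assumes the containers publish the default ports (27017 and 6379) on localhost and that the local MongoDB runs without authentication; check `docker ps` or the Compose file if your setup differs.

```python
# Sketch: confirm MongoDB and Redis are reachable after
# `docker compose up -d`. Assumes default ports published on
# localhost and no MongoDB auth; verify with `docker ps`.
from pymongo import MongoClient
import redis

mongo = MongoClient("mongodb://localhost:27017", serverSelectionTimeoutMS=2000)
print("MongoDB:", mongo.server_info()["version"])

cache = redis.Redis(host="localhost", port=6379)
print("Redis ping:", cache.ping())
```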
- Main Website: https://datanadhi.com
- Documentation: https://docs.datanadhi.com
- GitHub Organization: Data-ARENA-Space
This project is open source and available under the GNU Affero General Public License v3.0.
- GitHub Discussions: [Coming soon]
- Discord: Data Nadhi Community
- Issues: GitHub Issues