Data Nadhi is an open-source platform that helps you manage the flow of data starting from your application logs all the way to your desired destinations — databases, APIs, or alerting systems.
Direct. Transform. Deliver.
Flow your logs, trigger your pipelines.
Data Nadhi provides a unified platform to ingest, transform, and deliver data — powered by Temporal, MongoDB, Redis, and MinIO.
It connects easily with your applications using the Data Nadhi SDK, and gives you full control over how data moves across your system.
- Direct – Collect logs and data from your applications or external sources.
- Transform – Use Temporal workflows to apply filters, enrichments, or custom transformations.
- Deliver – Send the final processed data to any configured destination — all handled reliably and asynchronously.
Data Nadhi is designed to be modular, developer-friendly, and ready for production.
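To make the Transform and Deliver stages concrete, here is a minimal sketch of what a pipeline workflow could look like with Temporal's Python SDK (temporalio). The names used here (`LogPipelineWorkflow`, `enrich_event`, `deliver_event`) are illustrative assumptions, not Data Nadhi's actual worker code, which lives in data-nadhi-temporal-worker:

```python
from datetime import timedelta

from temporalio import activity, workflow


@activity.defn
async def enrich_event(event: dict) -> dict:
    # Hypothetical transform step: add a field to the event.
    event["enriched"] = True
    return event


@activity.defn
async def deliver_event(event: dict) -> None:
    # Hypothetical delivery step: push the event to a destination
    # (database, API, alerting system, ...).
    print(f"delivering: {event}")


@workflow.defn
class LogPipelineWorkflow:
    @workflow.run
    async def run(self, event: dict) -> None:
        # Transform, then deliver; Temporal retries each activity on failure.
        enriched = await workflow.execute_activity(
            enrich_event,
            event,
            start_to_close_timeout=timedelta(seconds=30),
        )
        await workflow.execute_activity(
            deliver_event,
            enriched,
            start_to_close_timeout=timedelta(seconds=30),
        )
```

Because each stage runs as a Temporal activity, retries and fault tolerance come from Temporal itself rather than from custom plumbing in your application.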
The platform is built from multiple services and tools working together:
| Component | Description |
|---|---|
| data-nadhi-server | Handles incoming requests from the SDK and passes them to Temporal. |
| data-nadhi-internal-server | Internal service for managing entities, pipelines, and configurations. |
| data-nadhi-temporal-worker | Executes workflow logic and handles transformations and delivery. |
| data-nadhi-sdk | Python SDK for logging and sending data from applications. |
| data-nadhi-dev | Local environment setup using Docker Compose for databases and Temporal. |
| data-nadhi-documentation | Documentation site built with Docusaurus (you’re here now). |
All components are connected through a shared Docker network, making local setup and development simple.
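If your own application needs to reach these services locally, it can join the same network. A minimal sketch, assuming the `datanadhi-net` network has already been created by the data-nadhi-dev setup (the `my-app` service name is hypothetical):

```yaml
# docker-compose.yml for your own application (illustrative)
networks:
  datanadhi-net:
    external: true  # created by the data-nadhi-dev environment

services:
  my-app:
    build: .
    networks:
      - datanadhi-net
```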
- 🧩 Unified Pipeline – Move data seamlessly from logs to destinations
- ⚙️ Custom Transformations – Define your own transformations using Temporal
- 🔄 Reliable Delivery – Retries, fault tolerance, and monitoring built in
- 🧠 Easy Integration – Simple SDK-based setup for applications
- 💡 Developer Focused – Dev containers and Docker-first setup for consistency
The Data Nadhi SDK is a Python package that behaves like a normal logger but also sends the log event to the Data Nadhi Server through its API when required.
- Python (logging) – Built on Python's standard logging module
- Docker – For consistent local and production deployment
- Docker Network (`datanadhi-net`) – Shared network for connecting all services locally
- Docker & Docker Compose
- VS Code (with Dev Containers extension)
If you want to modify or test it, open it directly in a Dev Container.
To use it as a package:
- Install it:

  ```bash
  pip install git+https://github.com/Data-ARENA-Space/data-nadhi-sdk.git
  ```
- Add the following to your `.env` file:

  ```env
  DATA_NADHI_API_KEY=<API-KEY>
  DATA_NADHI_SERVER_HOST=http://localhost
  ```
- If your service runs inside the same Docker network, remove the `DATA_NADHI_SERVER_HOST` variable.
- Add a log config file in the `.datanadhi` folder – see Log Config.
- Try logging:
```python
from dotenv import load_dotenv

from datanadhi import DataNadhiLogger

load_dotenv()


def main():
    # Set up API key (in production, use environment variable)
    # os.environ["DATA_NADHI_API_KEY"] = "dummy_api_key_123"

    # Initialize logger with module name
    logger = DataNadhiLogger(module_name="test_app")
    logger.info(
        "Testing basic stuff",
        context={
            "user": {
                "id": "user123",
                "status": "active",
                "email_verified": True,
                "type": "authenticated",
                "permissions": {"guest_allowed": False},
            }
        },
    )


if __name__ == "__main__":
    main()
```
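When this runs, the message behaves like a normal Python log line locally, and the SDK forwards the event to the Data Nadhi Server over its API when required, so your application needs no extra delivery code.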
- Main Website: https://datanadhi.com
- Documentation: https://docs.datanadhi.com
- GitHub Organization: Data-ARENA-Space
This project is open source and available under the GNU Affero General Public License v3.0.
- GitHub Discussions: [Coming soon]
- Discord: Data Nadhi Community
- Issues: GitHub Issues