OpenTelemetry-native run-level cost attribution for AI workflows.
Botanu adds runs on top of distributed tracing. A run represents a single business transaction that may span multiple LLM calls, database queries, and services. By correlating all operations to a stable run_id, you get accurate cost attribution without sampling artifacts.
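To make the run model concrete, here is a toy sketch (pure Python, not Botanu's API or attribute schema) of what run-level cost attribution means: every operation carries the same run_id, so per-run cost is a simple aggregation over spans.

```python
from collections import defaultdict

# Toy spans, as a run-aware backend might receive them. The field
# names ("run_id", "op", "cost_usd") are illustrative only.
spans = [
    {"run_id": "run-1", "op": "llm.complete", "cost_usd": 0.0042},
    {"run_id": "run-1", "op": "db.query", "cost_usd": 0.0},
    {"run_id": "run-1", "op": "llm.complete", "cost_usd": 0.0108},
    {"run_id": "run-2", "op": "llm.complete", "cost_usd": 0.0021},
]

def cost_per_run(spans):
    """Sum cost across all operations that share a run_id."""
    totals = defaultdict(float)
    for span in spans:
        totals[span["run_id"]] += span["cost_usd"]
    return dict(totals)

print(cost_per_run(spans))
```

Because every span is keyed by a stable run_id rather than sampled independently, the totals stay accurate even when a single business transaction fans out across many services.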
```python
from botanu import enable, botanu_use_case

enable(service_name="my-service")

@botanu_use_case(name="my_workflow")
def my_function():
    data = db.query(...)
    result = llm.complete(...)
    return result
```

```shell
pip install "botanu[all]"
```

| Extra | Description |
|---|---|
| `sdk` | OpenTelemetry SDK + OTLP exporter |
| `instruments` | Auto-instrumentation for HTTP, databases |
| `genai` | Auto-instrumentation for LLM providers |
| `all` | All of the above (recommended) |
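If you only need a subset, extras can be combined with standard pip syntax (the extra names come from the table above; the combination shown is just an example):

```shell
# Install only the core SDK plus LLM-provider instrumentation
pip install "botanu[sdk,genai]"
```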
When you install `botanu[all]`, the following are automatically tracked:
- LLM Providers — OpenAI, Anthropic, Vertex AI, Bedrock, Azure OpenAI
- Databases — PostgreSQL, MySQL, SQLite, MongoDB, Redis
- HTTP — requests, httpx, urllib3, aiohttp
- Frameworks — FastAPI, Flask, Django, Starlette
- Messaging — Celery, Kafka
No manual instrumentation required.
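Auto-instrumentation works by wrapping library entry points so that calls are recorded without any change to application code. The following is a toy illustration of that mechanism, not Botanu's actual internals:

```python
import functools

calls = []

def instrument(fn, span_name):
    """Toy auto-instrumentation: wrap a callable so every invocation
    is recorded as a 'span' without changing the caller's code."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        calls.append(span_name)
        return fn(*args, **kwargs)
    return wrapper

# Pretend this is a third-party client the application already uses.
def http_get(url):
    return f"response from {url}"

# Patch it in place; callers are unaware anything changed.
http_get = instrument(http_get, "HTTP GET")

http_get("https://example.com")
print(calls)  # ['HTTP GET']
```

Real instrumentation libraries apply this kind of wrapping at import time to the providers, database drivers, and HTTP clients listed above.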
For large-scale deployments (2000+ services):
| Service Type | Code Change | Kubernetes Config |
|---|---|---|
| Entry point | `@botanu_use_case` decorator | Annotation |
| Intermediate | None | Annotation only |
```yaml
# Intermediate services - annotation only, no code changes
metadata:
  annotations:
    instrumentation.opentelemetry.io/inject-python: "true"
```

Auto-instrumentation captures all HTTP calls, including retries (requests, httpx, aiohttp, urllib3).
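For context, this annotation belongs on the pod template so the OpenTelemetry Operator can inject the Python auto-instrumentation into each container. A hypothetical Deployment (all names illustrative) might look like:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-intermediate-service   # hypothetical service name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-intermediate-service
  template:
    metadata:
      labels:
        app: my-intermediate-service
      annotations:
        # Picked up by the OpenTelemetry Operator
        instrumentation.opentelemetry.io/inject-python: "true"
    spec:
      containers:
        - name: app
          image: registry.example.com/my-intermediate-service:latest  # hypothetical
```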
See Kubernetes Deployment Guide for details.
Requirements:

- Python 3.9+
- OpenTelemetry Collector (recommended for production)
See CONTRIBUTING.md. This project uses DCO sign-off:

```shell
git commit -s -m "Your commit message"
```

This project is an LF AI & Data Foundation project.