# Loki JSONL Log Ingestion

This project provides a complete pipeline to ingest `.jsonl`-formatted logs into Grafana Loki using Promtail, and to visualize them in Grafana.
## Features

- Ingest `.jsonl` structured logs via a custom Python script
- Pre-configured `docker-compose` setup for Loki, Promtail, and Grafana
- Real-time log visualization in Grafana Explore
- Supports ingestion of large datasets with rate limit management
## Project Structure

```
.
├── docker-compose.yml      # Loki, Promtail, Grafana services
├── loki-config.yaml        # Loki configuration
├── promtail-config.yaml    # Promtail configuration
└── upload_all_jsonl.py     # Python uploader for JSONL files
```
## Quick Start

```bash
git clone https://github.com/yourusername/loki-jsonl-log-ingestion.git
cd loki-jsonl-log-ingestion
sudo docker-compose up -d
```
This will start:

- Loki (default port: `3100`)
- Promtail (default port: `9080`)
- Grafana (default port: `3000`)
## Upload Logs

```bash
python3 upload_all_jsonl.py ./by_source/
```
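The uploader essentially reads each `.jsonl` file and POSTs its lines to Loki's push API. A minimal sketch of that flow, assuming Loki's default push endpoint on `localhost:3100` (the actual `upload_all_jsonl.py` may batch, label, and timestamp differently):

```python
import json
import time
import urllib.request
from pathlib import Path

LOKI_URL = "http://localhost:3100/loki/api/v1/push"  # Loki's default push endpoint

def build_payload(lines, labels, ts_ns=None):
    """Build a Loki push-API payload: one stream, one entry per log line."""
    ts = str(ts_ns if ts_ns is not None else time.time_ns())
    return {"streams": [{"stream": labels,
                         "values": [[ts, line] for line in lines]}]}

def push_lines(lines, labels):
    """POST a batch of raw log lines to Loki as a single stream."""
    req = urllib.request.Request(
        LOKI_URL,
        data=json.dumps(build_payload(lines, labels)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)

def upload_dir(directory, batch_size=500):
    """Upload every .jsonl file in a directory, in batches of lines."""
    for path in sorted(Path(directory).glob("*.jsonl")):
        lines = path.read_text().splitlines()
        # Batch to stay under Loki's per-request size limit
        for i in range(0, len(lines), batch_size):
            push_lines(lines[i:i + batch_size],
                       {"job": "files", "filename": path.name})
```

The `job: files` label here matches the `{job="files"}` query used later in Grafana Explore.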
⚠️ Note: Some files may exceed Loki's default ingestion limit (e.g., 4 MB per request). Split or downsample large files, or raise the limits in `loki-config.yaml`.
## Grafana

- Visit http://localhost:3000
- Default credentials:
  - Username: `admin`
  - Password: `admin` (you'll be prompted to change it)
- Go to **Explore**
- Select **Loki** as the data source
- Use queries like: `{job="files"}`
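Beyond the bare stream selector, LogQL supports line filters, parsers, and metric queries. A few illustrative examples (the `level` field is an assumption about the shape of your JSON logs):

```logql
# Only lines containing "error"
{job="files"} |= "error"

# Parse each JSON line and filter on a field
{job="files"} | json | level = "error"

# Per-second log rate over the last 5 minutes
sum(rate({job="files"}[5m]))
```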
## Promtail Configuration

- Promtail is configured to tail files in `/var/log/*.log` and ingest `sample.jsonl` logs
- You can customize `promtail-config.yaml` to change file targets or relabel log streams
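For reference, a minimal `scrape_configs` entry of the kind `promtail-config.yaml` likely contains (the exact job name and labels in this repo may differ):

```yaml
scrape_configs:
  - job_name: files
    static_configs:
      - targets:
          - localhost
        labels:
          job: files                  # matches the {job="files"} query in Explore
          __path__: /var/log/*.log    # glob of files Promtail tails
```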
## Troubleshooting

- **500 ResourceExhausted**: the file exceeds the maximum allowed request size (default 4 MB). Try chunking the data.
- **429 Rate limit exceeded**: too many log entries per second. Use `time.sleep()` between batches or chunk uploads.
- Use `curl http://localhost:3100/ready` to verify Loki is ready.
- Use `docker-compose logs` to debug individual service logs.
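For the 429 case, the `time.sleep()` advice generalizes to retry-with-backoff around whatever function performs the POST. A small illustrative wrapper (`send` is any callable that raises on failure; names and defaults are assumptions, not part of this repo):

```python
import time

def send_with_backoff(send, batch, max_retries=5, initial_delay=1.0):
    """Call send(batch); on failure, sleep and retry with exponential backoff."""
    delay = initial_delay
    for attempt in range(max_retries):
        try:
            send(batch)
            return True
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error
            time.sleep(delay)  # back off before retrying (handles 429s)
            delay *= 2
    return False
```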