A beautiful, production-ready log monitoring system with AI-powered insights using Groq, real-time WebSocket streaming, and a stunning React dashboard.
- ⚡ Real-time Log Processing - Asynchronous queue-based processing with Bull & Redis
- 🤖 AI-Powered Insights - Groq AI analyzes error logs and provides root cause analysis
- 📊 Metrics & Analytics - Real-time metrics calculation and historical tracking
- 🔌 WebSocket Streaming - Live log broadcasting to connected clients
- 💾 PostgreSQL Storage - Persistent log storage with efficient indexing
- 🚨 Smart Alerting - Configurable alert triggers for error rates and critical keywords
- 📡 REST API - Complete API for log ingestion and retrieval
- 🐳 Fully Dockerized - One command setup with Docker Compose
- 🎨 Beautiful UI - Glassmorphism design with smooth animations
- ✨ Framer Motion - Smooth transitions and micro-interactions
- 📈 Live Charts - Real-time metrics visualization with Recharts
- 🔍 Advanced Filtering - Search and filter by level, source, and time
- 🤖 AI Insights Display - Beautiful presentation of Groq AI analysis
- 🔄 Real-time Updates - WebSocket connection for instant log streaming
- 🎯 Toast Notifications - Non-intrusive alerts for errors and insights
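The error-rate alerting described above can be sketched as a simple threshold check. Everything below (names, config shape, the rule itself) is illustrative, not the actual LogStream code; it assumes a percentage threshold like the `ERROR_RATE_THRESHOLD` setting plus a critical-keyword match:

```typescript
// Hypothetical sketch of an error-rate alert check; names are illustrative.
type LogLevel = 'debug' | 'info' | 'warn' | 'error' | 'fatal';

interface AlertConfig {
  errorRateThreshold: number; // percent, e.g. ERROR_RATE_THRESHOLD=10
  criticalKeywords: string[];
}

// Returns true when the share of error/fatal logs in the window exceeds the
// configured threshold, or any message contains a critical keyword.
function shouldAlert(levels: LogLevel[], messages: string[], cfg: AlertConfig): boolean {
  if (levels.length === 0) return false;
  const errors = levels.filter((l) => l === 'error' || l === 'fatal').length;
  const ratePct = (errors / levels.length) * 100;
  if (ratePct > cfg.errorRateThreshold) return true;
  return messages.some((m) =>
    cfg.criticalKeywords.some((k) => m.toLowerCase().includes(k.toLowerCase()))
  );
}
```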
```
[Clients] → [REST API] → [Bull Queue + Redis] → [Workers] → [PostgreSQL]
                                                    ↓
                               [WebSocket Server] → [React Dashboard]
                                                    ↓
                                      [Groq AI] → [AI Insights]
```
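The flow above can be sketched in miniature. This is a hedged, in-memory stand-in, not the real implementation: a plain array replaces the Bull/Redis queue, a `Map` replaces PostgreSQL, and a callback replaces the Socket.IO broadcast; all names are illustrative.

```typescript
// Minimal in-memory sketch of the ingestion pipeline.
interface LogEvent {
  id: string;
  level: string;
  message: string;
  source: string;
}

class MiniPipeline {
  private queue: LogEvent[] = [];
  private store = new Map<string, LogEvent>();

  constructor(private broadcast: (e: LogEvent) => void) {}

  // REST API side: enqueue and return immediately (asynchronous processing).
  ingest(event: LogEvent): void {
    this.queue.push(event);
  }

  // Worker side: drain the queue, persist, then broadcast to subscribers.
  processPending(): number {
    let processed = 0;
    for (let e = this.queue.shift(); e !== undefined; e = this.queue.shift()) {
      this.store.set(e.id, e);
      this.broadcast(e);
      processed++;
    }
    return processed;
  }

  get(id: string): LogEvent | undefined {
    return this.store.get(id);
  }
}
```

The key property mirrored here is that `ingest` returns immediately while a separate worker step does the persistence and broadcasting later.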
Docker Setup (Recommended):
- Docker Desktop (download from docker.com)
- Groq API Key (free from console.groq.com)
Manual Setup:
- Node.js 20+ and npm
- PostgreSQL 15+
- Redis 7+
- Groq API Key
This is the easiest way to run LogStream: no need to install or run PostgreSQL or Redis separately.
```bash
git clone https://github.com/your-username/logstream-backend.git
cd logstream-backend
cp .env.example .env
```

Edit `.env` and fill in your values:

```env
POSTGRES_USER=your_db_user
POSTGRES_PASSWORD=your_db_password
GROQ_API_KEY=gsk_your_groq_api_key_here
```

Make sure `REDIS_HOST=redis` and `POSTGRES_HOST=db` are set in your `.env` (already set in `.env.example`).

```bash
docker compose up --build
```

That's it! All three services (app, PostgreSQL, Redis) start automatically.
- REST API: http://localhost:3001/api
- Health Check: http://localhost:3001/api/health
- WebSocket: ws://localhost:3001

Stop the stack with `docker compose down`, or `docker compose down -v` to also remove the data volumes.

Frontend repo: https://github.com/dishamurthy/logstream-dashboard
Step 1 - Start the backend first (this repo):

```bash
cd logstream-backend
cp .env.example .env   # fill in your values
docker compose up --build
```

Step 2 - In a new terminal, start the frontend:

```bash
cd logstream-dashboard
docker compose up --build
```

Step 3 - Open your browser:

- Dashboard: http://localhost:5173
- API: http://localhost:3001/api
```bash
cd logstream-backend
npm install express bull ioredis pg socket.io cors dotenv winston joi uuid axios
npm install -D typescript ts-node-dev @types/express @types/node @types/bull @types/cors @types/uuid jest @types/jest
```

Create a `.env` file:

```env
# Server Configuration
PORT=3000
NODE_ENV=development

# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=

# PostgreSQL Configuration
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=logstream
POSTGRES_USER=your_username
POSTGRES_PASSWORD=your_password

# WebSocket Configuration
WS_PORT=3001
WS_CORS_ORIGIN=http://localhost:5173

# Queue Configuration
QUEUE_CONCURRENCY=5
QUEUE_MAX_RETRIES=3

# Log Retention (in days)
LOG_RETENTION_DAYS=30

# Alert Thresholds
ERROR_RATE_THRESHOLD=10
LATENCY_THRESHOLD_MS=1000

# Groq AI Configuration
GROQ_API_KEY=gsk_your_groq_api_key_here
GROQ_MODEL=llama-3.3-70b-versatile
GROQ_API_URL=https://api.groq.com/openai/v1/chat/completions
GROQ_ENABLED=true
```

Install PostgreSQL and Redis (macOS):

```bash
brew install postgresql@15 redis
```
```bash
brew services start postgresql@15
brew services start redis
createdb logstream
```

Run the backend:

```bash
# Terminal 1: API Server
npm run dev

# Terminal 2: Worker Process
npm run worker
```

Scaffold and run the frontend:

```bash
cd ~/Desktop
npm create vite@latest logstream-dashboard -- --template react-ts
cd logstream-dashboard
npm install socket.io-client recharts framer-motion lucide-react react-hot-toast date-fns
npm install -D tailwindcss postcss autoprefixer
npx tailwindcss init -p
npm run dev
```

Open your browser at http://localhost:5173.
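One detail the dashboard has to handle is that logs stream in continuously over the WebSocket, so the on-screen list must not grow without bound. A minimal, framework-free sketch of a capped rolling buffer follows; the cap value and record shape are assumptions, not the actual dashboard code:

```typescript
// Hypothetical capped buffer backing the live log list (newest first).
interface DashboardLog {
  id: string;
  level: string;
  message: string;
}

function appendCapped(
  current: DashboardLog[],
  incoming: DashboardLog,
  cap = 500
): DashboardLog[] {
  // Prepend the new log and drop the oldest entries beyond the cap.
  // Returning a new array (instead of mutating) plays well with React state.
  return [incoming, ...current].slice(0, cap);
}
```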
Health check:

```
GET /api/health
```

Ingest a single log:

```
POST /api/logs
Content-Type: application/json

{
  "level": "error",
  "message": "Database connection failed",
  "source": "api_gateway",
  "userId": "user-123",
  "metadata": { "ip": "192.168.1.1" }
}
```

Ingest a batch:

```
POST /api/logs/batch
Content-Type: application/json

{
  "logs": [
    { "level": "info", "message": "User logged in", "source": "service" },
    { "level": "error", "message": "Payment failed", "source": "service" }
  ]
}
```

Query logs:

```
GET /api/logs?level=error&page=1&pageSize=50
GET /api/logs?source=api_gateway&startTime=2024-01-01T00:00:00Z
```

Metrics and queue stats:

```
GET /api/metrics?window=5
GET /api/queue/stats
```

WebSocket client:

```js
const socket = io('ws://localhost:3001');

socket.on('log:new', (event) => console.log('New log:', event.data));
socket.on('metrics:update', (event) => console.log('Metrics:', event.data));
socket.on('alert:triggered', (event) => console.log('Alert:', event.data));

socket.emit('subscribe:level', 'error');
socket.emit('subscribe:source', 'api_gateway');
```

Error/fatal logs are automatically sent to Groq AI, which returns:
- Summary: Brief description of the issue
- Root Cause: Likely cause of the error
- Suggested Fix: How to resolve the issue
- Severity: low, medium, high, or critical
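Since the insight comes back from an LLM, it is worth validating before storing or displaying it. The sketch below assumes the four fields above map to a JSON object; the exact field names are an assumption, not the documented LogStream schema:

```typescript
// Hypothetical shape for the AI insight payload; field names are assumed.
interface AiInsight {
  summary: string;
  rootCause: string;
  suggestedFix: string;
  severity: 'low' | 'medium' | 'high' | 'critical';
}

const SEVERITIES = ['low', 'medium', 'high', 'critical'];

// LLM output is untrusted input: validate it before persisting.
function parseInsight(raw: string): AiInsight | null {
  try {
    const obj = JSON.parse(raw);
    if (
      typeof obj.summary === 'string' &&
      typeof obj.rootCause === 'string' &&
      typeof obj.suggestedFix === 'string' &&
      SEVERITIES.includes(obj.severity)
    ) {
      return obj as AiInsight;
    }
    return null; // missing fields or unknown severity
  } catch {
    return null; // not valid JSON
  }
}
```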
- Visit console.groq.com
- Sign up / Log in → API Keys → Create API Key
- Add it to `.env` as `GROQ_API_KEY=gsk_...`
```sql
CREATE TABLE logs (
  id VARCHAR(36) PRIMARY KEY,
  level VARCHAR(10) NOT NULL,
  message TEXT NOT NULL,
  source VARCHAR(50) NOT NULL,
  timestamp TIMESTAMPTZ NOT NULL,
  parsed_at TIMESTAMPTZ NOT NULL,
  metadata JSONB,
  user_id VARCHAR(100),
  request_id VARCHAR(100),
  session_id VARCHAR(100),
  tags TEXT[],
  stack_trace TEXT,
  context JSONB,
  ai_insight JSONB
);
```
```bash
node test-client.js single          # Send a single log
node test-client.js batch 20        # Send a batch of 20 logs
node test-client.js simulate 120000 # Simulate traffic for 2 minutes
node test-client.js errors          # Send error logs (triggers AI analysis)
```

Manual curl test:

```bash
curl -X POST http://localhost:3001/api/logs \
  -H "Content-Type: application/json" \
  -d '{"level": "error", "message": "Payment processing failed", "source": "service"}'
```

Project structure:

```
logstream-backend/
├── src/
│   ├── api/routes.ts
│   ├── workers/logProcessor.ts
│   ├── config/index.ts
│   ├── types/index.ts
│   ├── utils/
│   │   ├── database.ts
│   │   ├── queue.ts
│   │   ├── websocket.ts
│   │   ├── validation.ts
│   │   └── groq.ts
│   └── index.ts
├── Dockerfile
├── docker-compose.yml
├── .dockerignore
├── .env.example       # Safe to commit - no real secrets
├── .env               # Never committed - your real secrets
├── .gitignore
├── package.json
└── tsconfig.json
```
Docker: Port already in use

```bash
lsof -ti:3001 | xargs kill -9
docker compose down
docker compose up --build
```

Docker: Database errors on startup

```bash
docker compose down -v   # clears volumes
docker compose up --build
```

Manual: Redis connection failed

```bash
brew services restart redis
redis-cli ping   # Should return PONG
```

Manual: PostgreSQL connection failed

```bash
brew services restart postgresql@15
psql -l
```

Groq API errors

- Check your API key in `.env`
- Verify `GROQ_ENABLED=true`
- Check quota at console.groq.com

Security notes:

- `.env` is gitignored; never commit real secrets
- Use `.env.example` with dummy values for sharing
- Add authentication before production deployment
- Implement rate limiting on API endpoints
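The last point (rate limiting) could start as small as a fixed-window counter per client; below is a minimal sketch with illustrative names. In production, a proven middleware or a Redis-backed limiter shared across workers is the safer choice:

```typescript
// Hypothetical fixed-window rate limiter, one counter per client ID.
class FixedWindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request is allowed, false if the client is over limit.
  allow(clientId: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(clientId);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // Start a fresh window for this client.
      this.counts.set(clientId, { windowStart: now, count: 1 });
      return true;
    }
    entry.count++;
    return entry.count <= this.limit;
  }
}
```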
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit: `git commit -m 'Add amazing feature'`
- Push: `git push origin feature/amazing-feature`
- Open a Pull Request
MIT License - feel free to use for personal or commercial purposes.
Built with ❤️ using React, TypeScript, Node.js, PostgreSQL, Redis, Groq AI, and Docker

⭐ Star this repo if you find it useful!




