FastAPI + YOLOv8 backend for a React dashboard. Streams annotated MJPEG video and live metrics over WebSockets. Supports polygon zones, human occupancy, dwell time, and recent events.
- Backend: FastAPI, Ultralytics YOLOv8, WebSockets, PyTorch (CUDA), OpenCV
- Dev: Conda env, Windows/RTX GPU
```bash
conda env create -f environment.yml
conda activate yolo

# copy config and edit if needed (e.g., SOURCE, ALLOW_ORIGINS)
cp .env.example .env

# run API (LAN-accessible)
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
```

Environment variables (set in `.env`):

- `SOURCE` — camera index like `0` (DSHOW) or an RTSP/HTTP URL
- `IMG_SIZE`, `CONF`, `STREAM_FPS`, `JPEG_QUALITY`, `HALF`
- `ALLOW_ORIGINS` — UI origin(s), e.g. `http://localhost:5173`
- `TRACK_MISS_TTL`, `EVENTS_MAX`
- `OCCUPANCY_CLASSES` — CSV of classes to count (default `person`)
- `YOLO_WEIGHTS` — e.g. `yolov8n.pt`
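The backend reads these variables at startup. A minimal sketch of how they might be loaded with stdlib `os.getenv`-style lookups — the defaults shown here are illustrative guesses, not the project's actual values:

```python
import os

def _csv(value: str) -> list[str]:
    """Split a comma-separated env value into a clean list."""
    return [item.strip() for item in value.split(",") if item.strip()]

def load_settings(env=os.environ) -> dict:
    """Hypothetical loader -- keys mirror the .env names above, defaults are assumed."""
    return {
        "source": env.get("SOURCE", "0"),           # camera index or RTSP/HTTP URL
        "img_size": int(env.get("IMG_SIZE", "640")),
        "conf": float(env.get("CONF", "0.25")),
        "stream_fps": float(env.get("STREAM_FPS", "15")),
        "jpeg_quality": int(env.get("JPEG_QUALITY", "80")),
        "half": env.get("HALF", "false").lower() in ("1", "true", "yes"),
        "allow_origins": _csv(env.get("ALLOW_ORIGINS", "http://localhost:5173")),
        "occupancy_classes": _csv(env.get("OCCUPANCY_CLASSES", "person")),
        "yolo_weights": env.get("YOLO_WEIGHTS", "yolov8n.pt"),
    }
```

Passing a plain dict as `env` keeps the loader easy to unit-test without touching the real environment.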
Zones persist to `zones.json`.
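The exact schema of `zones.json` isn't shown here, but assigning a detection to a zone reduces to a point-in-polygon test on the detection's anchor point. A minimal ray-casting sketch, assuming each zone is a list of `(x, y)` vertices:

```python
def point_in_polygon(x: float, y: float, polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting test: count how many polygon edges a ray going right crosses."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Only edges that straddle the horizontal line through y can be crossed.
        if (y1 > y) != (y2 > y):
            # X coordinate where the edge crosses that horizontal line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside  # odd number of crossings => inside
    return inside
```

In practice the tested point is often the bottom-center of the detection's bounding box, so a person is "in" the zone their feet are in.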
- `GET /video` — MJPEG stream of annotated frames
- `WS /ws` — live metrics (~5 Hz): counts, detections (with `zone`, `dwell_s`), `occupancy`, `recent_events`
- `GET /zones` / `PUT /zones` — read/write polygon zones
- `GET /events?limit=50` — recent entry/exit/transfer events (in-memory)
- `GET /health` — basic runtime info
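`GET /video` serves MJPEG: a `multipart/x-mixed-replace` response in which each part is one JPEG, so the browser replaces the image in place. A sketch of the per-frame framing — the boundary string is an assumption, and in the real app the JPEG bytes would come from encoding the annotated frame (e.g. OpenCV's `imencode`):

```python
from typing import Iterable, Iterator

BOUNDARY = "frame"  # hypothetical; must match the boundary in the Content-Type header

def mjpeg_stream(jpeg_frames: Iterable[bytes]) -> Iterator[bytes]:
    """Wrap each JPEG in multipart part headers for an MJPEG response body."""
    for jpeg in jpeg_frames:
        yield (
            f"--{BOUNDARY}\r\n"
            f"Content-Type: image/jpeg\r\n"
            f"Content-Length: {len(jpeg)}\r\n\r\n"
        ).encode() + jpeg + b"\r\n"

# In FastAPI this generator would be returned roughly as:
#   StreamingResponse(mjpeg_stream(frames),
#                     media_type=f"multipart/x-mixed-replace; boundary={BOUNDARY}")
```

Throttling the generator to `STREAM_FPS` (sleeping between yields) keeps encode and bandwidth costs bounded per client.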
- For phone/another device: set `ALLOW_ORIGINS=http://<your-ip>:5173` and point your UI at `http://<your-ip>:8000`.
- Human-only occupancy is on by default via `OCCUPANCY_CLASSES=person`.