Aegis Analyst for Frigate NVR

A resilient, edge-optimized middleware that bridges the gap between Frigate NVR's object detection and Local LLMs (via Ollama) to provide rich, narrative summaries of security events to Home Assistant.

πŸš€ Key Features

  • Edge Optimized: Designed specifically for consumer hardware (e.g., NVIDIA GTX 1060 6GB).
  • Fail-Over Inference: Automatically switches from a high-quality 7B model (LLaVA) to a lightweight 1.8B model (Moondream) if VRAM is exhausted, ensuring continuous operation.
  • Home Assistant Integration: Automatically discovered via MQTT, pushing a daily "AI Digest" and granular event statistics.
  • Batch Processing: Intelligently batches requests to prevent thermal throttling and maintain system stability.
  • Frigate Write-back: (Optional) Automatically updates Frigate events with rich AI-generated descriptions or sub-labels.
  • Context Aware: Correlates visual events with Home Assistant sensor data (e.g., "Person detected" + "Door opened").

πŸ› οΈ Architecture

```mermaid
graph LR
    Frigate[Frigate NVR] -- Review Items & Snapshots --> Agent[Aegis Analyst]
    Agent -- Tier 1/3 Requests --> Ollama[Ollama LLM]
    Agent -- Tier 2 Context --> HA[Home Assistant]
    Agent -- Discovery & States --> MQTT[MQTT Broker]
    MQTT --> HA
```

🧠 AI Inference Pipeline

```mermaid
graph TD
    subgraph "Tier 1: Perception"
    F[Frigate Review Item] -- Native GenAI or VLM --> Desc[Visual Description]
    end

    subgraph "Tier 2: Correlation"
    Desc -- + HA History --> Corr[Semantic Narrative]
    end

    subgraph "Tier 3: Synthesis"
    Corr -- Text LLM --> Brief[Daily Security Briefing]
    end

    Brief -- MQTT --> HA_S[Home Assistant Sensors]
```
  1. Perception (Tier 1): Frigate NVR detects objects and (ideally) generates a description using its native Generative AI. The agent retrieves these Review Items, prioritizing existing descriptions; if a description is missing, it falls back to its own VLM logic.
  2. Analysis (Tier 2): The agent performs Semantic Correlation. It correlates the visual description with Home Assistant's History API (e.g., "Person detected" + "Door opened") to produce a cohesive narrative.
  3. Reporting (Tier 3): A Text LLM (e.g., Llama 3) synthesizes the day's enriched narratives into a dense, narrative Daily Report.
  4. Presentation: Results are published via MQTT as granular sensors for easy graphing and dashboarding.
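As an illustration of Tier 2, a correlation prompt might be assembled along these lines (the function and field names here are hypothetical, not the agent's actual API):

```python
def build_correlation_prompt(description, ha_events):
    """Merge a visual description with Home Assistant history into a
    single prompt for the text LLM (Tier 2: Semantic Correlation)."""
    history = "\n".join(f"- {ts}: {event}" for ts, event in ha_events)
    return (
        "You are a home security analyst. Combine the camera observation "
        "with the sensor history below into one short narrative.\n\n"
        f"Camera: {description}\n"
        f"Sensor history:\n{history}"
    )
```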

πŸ“‹ Prerequisites

  • Inference Backend (Ollama):
    • NVIDIA GPU (Recommended: 6GB+ VRAM).
    • Ollama (v0.1.30+) running on host or in a separate container.
  • Frigate NVR (v0.14+):
    • It is highly recommended to enable Frigate's native Generative AI for Tier 1 analysis.

Configuring Frigate Native GenAI (Recommended)

To offload perception to Frigate, add the following to your config.yml:

```yaml
genai:
  enabled: True
  provider: ollama
  base_url: http://ollama:11434
  model: llava:7b # Or any supported VLM
```

Aegis Analyst will automatically detect these descriptions and skip its internal VLM step, focusing entirely on Home Assistant correlation.
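A sketch of that skip logic (the review-item field path is an assumption based on Frigate's API shape, not verified against the agent's code):

```python
def needs_vlm(review_item: dict) -> bool:
    """Return True when Frigate supplied no native GenAI description,
    i.e. the agent must run its own VLM fallback (Tier 1)."""
    desc = (review_item.get("data", {}).get("description") or "").strip()
    return desc == ""
```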

πŸ–₯️ Hardware Constraints (Example: GTX 1060 6GB)

This project is optimized for edge hardware with limited VRAM.

  • Primary Vision Model: llava:7b (Quantized Q4_K_M takes ~4.8GB VRAM).
  • Fallback Model: moondream (Takes ~1.6GB VRAM).
  • Fail-over Pattern: If the primary model causes an Out-Of-Memory (OOM) error, the system automatically degrades to the fallback model to ensure reporting continuity.

βš™οΈ Installation & Setup

1. Prepare Ollama

Run these commands on your host machine to pull the required models:

```bash
# Vision Model
ollama pull llava:7b

# Analyst Model
ollama pull llama3

# Fallback Model (Edge optimized)
ollama pull moondream
```

2. Deployment options

Option A: Docker Compose (Recommended)

Create a docker-compose.yml file and mount your configuration:

```yaml
services:
  aegis-analyst:
    image: ghcr.io/michaelwoods/aegis-analyst:latest
    container_name: aegis-analyst
    restart: unless-stopped
    volumes:
      - ./config.yaml:/app/config.yaml:ro
      - ./data:/app/data
    environment:
      - TZ=America/New_York
```

Start the agent:

```bash
docker compose up -d aegis-analyst
```

Option B: Manual Installation (Development)

Clone the repository and set up your environment:

```bash
git clone https://github.com/michaelwoods/aegis-analyst.git
cd aegis-analyst
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
cp config.example.yaml config.yaml
```

Edit config.yaml to match your network environment:

  • Frigate URL: e.g., http://192.168.1.50:5000
  • Ollama:
    • vision_model: e.g., llava:7b
    • text_model: e.g., llama3
    • fallback_model: e.g., moondream
  • MQTT Broker: e.g., 192.168.1.50
  • Home Assistant:
    • context_entities: List of sensors (e.g., binary_sensor.front_door) to query for historical context.

3. Automation (Optional)

To run the agent on a schedule (e.g., every hour) without Docker, you can use Cron:

```cron
# Example: Run every hour
0 * * * * /path/to/aegis-analyst/.venv/bin/python /path/to/aegis-analyst/agent.py >> /var/log/aegis_analyst.log 2>&1
```

πŸ“Š Home Assistant Dashboard

The agent uses MQTT Discovery to automatically create devices and entities.

Aegis Analyst Dashboard

Key Sensors

  • sensor.aegis_analyst_status: Current operational state.
  • sensor.aegis_analyst_daily_security_report: Full narrative summary in its content attribute.
  • sensor.aegis_analyst_[label]_count: Individual count sensors for each object type.

Lovelace Card Example

Visualize the report using a Vertical Stack card:

```yaml
type: vertical-stack
cards:
  - type: entities
    entities:
      - entity: sensor.aegis_analyst_status
        name: Status
      - entity: sensor.aegis_analyst_bird_count
        name: Birds Detected
      - entity: sensor.aegis_analyst_car_count
        name: Cars Detected
      - entity: sensor.aegis_analyst_person_count
        name: People Detected
  - type: markdown
    title: Daily Security Briefing
    content: |
      {{ state_attr('sensor.aegis_analyst_daily_security_report', 'content') }}
```

πŸ§‘β€πŸ’» Development

We use a Makefile to enforce quality standards.

```bash
make install    # Setup environment
make lint       # Run Ruff linter
make type-check # Run MyPy
make test       # Run all tests
make check      # Run all of the above
```

πŸ› οΈ Troubleshooting

  • requests.exceptions.ConnectionError: Ensure Ollama is running and binding to 0.0.0.0 if the agent is running inside Docker.
  • Ollama error: CUDA out of memory: The agent should auto-switch to moondream. If it still fails, reduce batch_limit in config.yaml.
  • Home Assistant shows "Unknown": The sensors are only created after the first successful run.

🀝 Contributing

  1. Fork the repository.
  2. Create a feature branch (git checkout -b feature/amazing-feature).
  3. Commit your changes.
  4. Run tests (make check).
  5. Open a Pull Request.

πŸ“„ License

MIT License - Copyright (c) 2026 Michael Woods
