Ash-Blanc/migru

Migru

Migru is a local-first, personal, and private AI-powered companion designed to support you through migraines and stress with empathy and research-backed relief strategies. It combines ultra-fast responses with deep, personalized wisdom, while keeping your data under your control.

🌟 Project Overview

Migru prioritizes empathy, research-backed relief strategies, and ultra-fast performance, and has been enhanced for the Kaggle Med-Gemma Impact Challenge using Google's Gemma 2.

  • Primary Mission: To walk alongside users with a "wise, humble, and deeply curious" persona, helping them discover wellness patterns.
  • Key Technologies:
    • Language: Python 3.12+ (managed with uv)
    • Agent Framework: Agno AI (formerly Phidata)
    • AI Models:
      • Local: Google Gemma 2 (9B) for clinical insights (Med-Gemma).
      • Cloud Fallback: Mistral Small, Cerebras (llama3.1-8b).
    • Privacy: 100% Local "Edge AI" mode available.
    • Database/Memory: Redis (Conversation history, user profiles, and real-time pattern detection).
    • Streaming Analytics: Pathway for low-latency pattern recognition.
    • UI: Rich for a beautiful terminal-based interface.
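As a hedged sketch of the Redis-backed memory layer above (the key layout and helper names are assumptions for illustration, not Migru's actual schema), conversation turns could be stored as a per-user list:

```python
import json

# Minimal sketch of a Redis-backed conversation memory.
# `client` can be any Redis-style object, e.g. redis.Redis(decode_responses=True).
# The "migru:history:<user>" key layout is a hypothetical example.

def remember_turn(client, user: str, role: str, text: str) -> None:
    """Append one conversation turn to the user's local history list."""
    client.rpush(f"migru:history:{user}", json.dumps({"role": role, "text": text}))

def recent_history(client, user: str, n: int = 10) -> list[dict]:
    """Return the last n turns, oldest first, for prompt context."""
    return [json.loads(x) for x in client.lrange(f"migru:history:{user}", -n, -1)]
```

Because everything lives in a local Redis instance, history never has to leave the device unless a cloud model is explicitly selected.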

🌟 Key Features

  • 🔒 Privacy-First Design: Complete local AI processing option with FunctionGemma, Qwen2.5, and other models
  • 🧠 Smart Agent Routing: Intelligent agent selection for optimal responses
  • 🌿 Empathetic Conversations: Therapeutic support optimized for wellness
  • ⚡ Ultra-Fast Responses: 1-3 second replies with Cerebras, or local inference with no network round-trip
  • 🔍 Optional Web Search: Privacy-aware research tools, only when you need them
  • 📊 Real-time Analytics: Pattern detection and wellness insights with Pathway
  • 🎨 Beautiful CLI: Rich themes and accessibility features
  • 💾 Local-First Storage: Your data stays on your device with Redis

✨ Showcase

Beautiful CLI Experience: a clean, calming interface designed for wellness and relief.

  • Welcome Screen: 🌸 warm welcome with elegant design
  • Conversation: 💬 natural, empathetic conversations
  • Research Capabilities: 🔍 intelligent research with fallbacks
  • Smart Responses: 🧠 thoughtful, context-aware responses
  • Full Experience: ⚡ ultra-fast responses (1-3 seconds) with Cerebras AI


🚀 Building and Running

Prerequisites

  • Python 3.12+
  • uv package manager
  • Redis server (local)
  • Ollama (for local models) -> ollama pull gemma2:9b

Key Commands

  • Install (Global): uv tool install -e . --python 3.12 (local dev) or uv tool install migru --python 3.12 (published)
  • Run (Therapeutic): migru, or uv run -m app.main from source
  • Run (Work Mode): migru --work
  • Medical Analysis: use the /med command in chat
  • Test: pytest (configured in pyproject.toml)
  • Lint: ruff check .
  • Type Check: mypy app/

πŸ› οΈ Development Conventions

Code Quality

The codebase follows best practices:

  • Type Hints: Comprehensive type annotations
  • Error Handling: Graceful degradation and user-friendly messages
  • Logging: Structured logging for debugging
  • Modularity: Clear separation of concerns
  • Documentation: Comprehensive docstrings and READMEs

Performance

  • Lazy Loading: Heavy dependencies loaded only when needed
  • Service Caching: Frequently accessed services are cached
  • Optimized Imports: Organized imports to avoid circular dependencies
  • Memory Monitoring: Memory usage tracking and optimization
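The lazy-loading and service-caching ideas above can be sketched in a few lines. The service name and the heavy import are stand-ins, not Migru's real modules:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def get_analytics_service():
    """Create the service on first use and cache it for every later call.

    The heavy dependency is imported inside the function, so merely
    importing this module stays cheap.
    """
    import statistics  # stand-in for a heavy dependency such as pathway

    class AnalyticsService:
        def mean_heart_rate(self, samples: list[int]) -> float:
            return statistics.mean(samples)

    return AnalyticsService()
```

Since lru_cache memoizes the result, get_analytics_service() returns the same instance on every call, so repeated lookups pay no construction cost.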

📚 Documentation

Each module includes comprehensive documentation:

  • app/cli/README.md - CLI module documentation
  • app/services/ - Service layer documentation
  • app/agents.py - Agent architecture documentation
  • app/config.py - Configuration documentation

πŸ› οΈ Installation

Prerequisites

  • Python 3.12+
  • Redis (local)
  • uv package manager

1. Install Globally (Recommended)

You can install Migru as a global command-line tool directly:

# If running from the source directory (Development)
# Use editable mode and pin Python 3.12 for binary compatibility
uv tool install -e . --python 3.12

# Or install globally without cloning (once published)
uv tool install migru --python 3.12

Then simply run:

migru

2. Configure Environment

Create a .env file in your working directory with your API keys:

# Cloud AI Providers (Default Mode)
MISTRAL_API_KEY=...     # Primary Intelligence
CEREBRAS_API_KEY=...    # Ultra-Fast Responses (Recommended)
OPENROUTER_API_KEY=...  # Fallback Provider

# Optional: Web Search & Weather
FIRECRAWL_API_KEY=...   # Deep Research
OPENWEATHER_API_KEY=... # Environmental Context

# Optional: Local LLM (See Local LLM Support section)
LOCAL_LLM_ENABLED=false
LOCAL_LLM_HOST=http://localhost:8080
PRIVACY_MODE=cloud
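A hedged sketch of how these variables might be read at startup. The defaults mirror the example above, but the function itself is illustrative, not Migru's actual config loader:

```python
import os

def load_settings() -> dict:
    """Read Migru-style settings from the environment, with safe defaults."""
    return {
        "mistral_api_key": os.getenv("MISTRAL_API_KEY"),  # None if unset
        "local_llm_enabled": os.getenv("LOCAL_LLM_ENABLED", "false").lower() == "true",
        "local_llm_host": os.getenv("LOCAL_LLM_HOST", "http://localhost:8080"),
        "privacy_mode": os.getenv("PRIVACY_MODE", "cloud"),  # cloud | hybrid | local
    }
```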

💻 Usage & Commands

Migru is primarily a CLI application.

Basic Start

If installed globally:

migru

Or run from source:

# Standard mode (cloud AI)
uv run -m app.main

# Enhanced mode (with local LLM support)
uv run -m app.main_enhanced

Custom User Profile

Start with a specific user context (loads your personal history/patterns):

migru --user "Alex"

Accessibility Mode

Launch with high-contrast UI and reduced motion/animations:

migru --accessible

CLI Flags

| Flag | Short | Description |
| --- | --- | --- |
| --user <name> | -u | Sets the active user profile name (Default: "Friend") |
| --accessible | -a | Enables high-contrast, reduced-motion UI |
| --quiet | -q | Suppresses startup banner and welcome messages |
| --verbose | -v | Shows detailed performance logs and debug info |
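For illustration, the flags above map naturally onto an argparse parser. This is a hypothetical reconstruction, not Migru's actual CLI code:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Hypothetical parser mirroring the documented flags."""
    p = argparse.ArgumentParser(prog="migru")
    p.add_argument("-u", "--user", default="Friend",
                   help="active user profile name")
    p.add_argument("-a", "--accessible", action="store_true",
                   help="high-contrast, reduced-motion UI")
    p.add_argument("-q", "--quiet", action="store_true",
                   help="suppress startup banner and welcome messages")
    p.add_argument("-v", "--verbose", action="store_true",
                   help="detailed performance logs and debug info")
    return p
```

For example, migru --user "Alex" -a would parse to user="Alex" with accessible=True.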

In-App Commands

Once inside the chat, use these slash commands to interact with the system:

| Command | Description | Example Output |
| --- | --- | --- |
| /profile | View your learned preferences & bio context | Work: Remote, Sensitivities: Light |
| /patterns | See discovered wellness rhythms | Peak Symptom Hour: 10:00 AM |
| /bio <args> | Simulate biometric data input | /bio hr=110 sleep=60 |
| /model | Switch AI models dynamically | Switched to Mistral AI |
| /history | View recent conversation memories | Last topic: Magnesium for relief |
| /clear | Clear the terminal screen | (Clears screen) |
| /exit | End the session gracefully | (Saves state and exits) |

Local LLM Commands (When Enabled)

| Command | Description | Example |
| --- | --- | --- |
| /privacy status | Check current privacy settings | Shows local/hybrid/cloud mode |
| /privacy local | Switch to 100% private mode | Disables all external APIs |
| /privacy hybrid | Local AI + optional search | Balance privacy and features |
| /local status | Show current local model | Using Qwen2.5:3B |
| /local models | List available local models | Shows all downloaded models |
| /local switch <model> | Switch local models | /local switch qwen2.5:3b |
| /local test | Test local LLM connection | Verifies server is running |

🧠 Multimodal Simulation

You can simulate wearable/sensor data directly from the CLI to test the Real-Time Analytics engine.

Example: Simulating High Stress

/bio hr=120 sleep=50 steps=200

System Response: The analytics engine fuses this high heart rate data with your conversation. If you then say "I feel anxious", the system triggers a Reactive Alert due to the correlation of physiological and verbal signals.
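The fusion rule described above can be sketched as a simple AND over physiological and verbal signals. The thresholds and keywords below are illustrative assumptions, not Migru's tuned values:

```python
from dataclasses import dataclass

# Illustrative keyword list -- a hypothetical stand-in for real sentiment analysis.
STRESS_WORDS = {"anxious", "stressed", "overwhelmed", "panicking"}

@dataclass
class BioSample:
    hr: int     # heart rate in bpm
    sleep: int  # sleep quality score, 0-100

def reactive_alert(bio: BioSample, message: str) -> bool:
    """Fire only when a physiological spike AND a verbal signal co-occur."""
    physiological = bio.hr >= 110 or bio.sleep <= 55
    verbal = any(word in message.lower() for word in STRESS_WORDS)
    return physiological and verbal
```

Requiring both signals keeps the alert from firing on a hard workout (high heart rate, calm words) or on a passing complaint (stress words, normal vitals).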


πŸ” Search & Research

Migru automatically detects when you need external facts.

Trigger Word: define

You: "Define prodrome phase."

Migru (Research Agent): ## Key Findings

  • The prodrome is the "pre-headache" phase, occurring hours or days before pain.
  • Symptoms include yawning, mood changes, and food cravings.
  • Recognizing it can allow for early intervention.

πŸ—οΈ Architecture Highlights

  • Local-First Privacy: Your conversation history and patterns are stored in your own local Redis instance, ensuring your wellness data stays private.
  • Adaptive Context: Automatically adjusts persona (calm vs. energetic) based on your detected mood.
  • Data Fusion: Correlates chat logs with simulated biometric streams using Pathway.
  • Redis Pipelining: Atomic, low-latency updates for real-time pattern tracking.
  • Dynamic Routing: Intelligently switches between "Fast" (Cerebras) and "Smart" (Mistral) models based on query complexity.
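A rough sketch of the dynamic-routing idea. The keyword heuristic and model identifiers are assumptions for illustration; Migru's real router may weigh other signals:

```python
# Hypothetical complexity hints; "define" also triggers the Research Agent above.
COMPLEX_HINTS = ("why", "explain", "research", "compare", "define")

def pick_model(query: str) -> str:
    """Route long or research-flavoured queries to the 'Smart' model,
    everything else to the 'Fast' one."""
    q = query.lower()
    if len(q.split()) > 25 or any(hint in q for hint in COMPLEX_HINTS):
        return "mistral-small"        # Smart path
    return "cerebras-llama3.1-8b"     # Fast path
```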

👉 Read the End-to-End Architecture Deep Dive

For deep performance tuning, see PERFORMANCE.md.

