Migru is a local-first, personal, and private AI-powered companion designed to support you through migraines and stress with empathy and research-backed relief strategies. It combines ultra-fast responses with deep, personalized wisdom while keeping your data under your control. Migru was enhanced for the Kaggle Med-Gemma Impact Challenge using Google's Gemma 2.
- Primary Mission: To walk alongside users with a "wise, humble, and deeply curious" persona, helping them discover wellness patterns.
- Key Technologies:
  - Language: Python 3.12+ (managed with `uv`)
  - Agent Framework: Agno AI (formerly Phidata)
  - AI Models:
    - Local: Google Gemma 2 (9B) for clinical insights (Med-Gemma)
    - Cloud Fallback: Mistral Small, Cerebras (llama3.1-8b)
  - Privacy: 100% local "Edge AI" mode available
  - Database/Memory: Redis (conversation history, user profiles, and real-time pattern detection)
  - Streaming Analytics: Pathway for low-latency pattern recognition
  - UI: Rich for a beautiful terminal-based interface
- 🔒 Privacy-First Design: Complete local AI processing option with FunctionGemma, Qwen2.5, and other models
- 🧠 Smart Agent Routing: Intelligent agent selection for optimal responses
- 🌿 Empathetic Conversations: Therapeutic support optimized for wellness
- ⚡ Ultra-Fast Responses: Local inference eliminates network latency (1-3 seconds with Cerebras)
- 🔍 Optional Web Search: Privacy-aware research tools only when you need them
- 📊 Real-Time Analytics: Pattern detection and wellness insights with Pathway
- 🎨 Beautiful CLI: Rich themes and accessibility features
- 💾 Local-First Storage: Your data stays on your device with Redis
Clean, calming interface designed for wellness and relief
- Python 3.12+
- `uv` package manager
- Redis server (local)
- Ollama (for local models): `ollama pull gemma2:9b`
- Install (Global): `uv tool install -e . --python 3.12` (local dev) or `uv tool install migru --python 3.12` (published)
- Run (Therapeutic): `migru` or `uv run -m app.main`
- Run (Work Mode): `migru --work`
- Medical Analysis: use the `/med` command in chat
- Run (Source): `uv run -m app.main`
- Test: `pytest` (configured in `pyproject.toml`)
- Lint: `ruff check .`
- Type Check: `mypy app/`
The codebase follows best practices:
- Type Hints: Comprehensive type annotations
- Error Handling: Graceful degradation and user-friendly messages
- Logging: Structured logging for debugging
- Modularity: Clear separation of concerns
- Documentation: Comprehensive docstrings and READMEs
- Lazy Loading: Heavy dependencies loaded only when needed
- Service Caching: Frequently accessed services are cached
- Optimized Imports: Organized imports to avoid circular dependencies
- Memory Monitoring: Memory usage tracking and optimization
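The lazy-loading and service-caching patterns above can be sketched in plain Python. This is an illustrative sketch, not Migru's actual code; the function names and the stand-in "heavy" dependency are assumptions:

```python
from functools import lru_cache
import importlib


def word_stats(text: str) -> dict:
    """Lazy loading: import the (stand-in) heavy dependency on first use.

    Here `statistics` stands in for a genuinely heavy library, so startup
    stays fast and the cost is paid only on the first call.
    """
    stats = importlib.import_module("statistics")
    lengths = [len(w) for w in text.split()]
    return {"words": len(lengths), "mean_len": stats.mean(lengths)}


@lru_cache(maxsize=None)
def get_service(name: str) -> dict:
    """Service caching: build each service once, then reuse the instance.

    A real implementation might open a Redis connection or load a model here.
    """
    return {"name": name, "ready": True}
```

Because `get_service` is wrapped in `lru_cache`, repeated lookups for the same service name return the identical object rather than reconstructing it.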
Each module includes comprehensive documentation:
- `app/cli/README.md` - CLI module documentation
- `app/services/` - Service layer documentation
- `app/agents.py` - Agent architecture documentation
- `app/config.py` - Configuration documentation
- Python 3.12+
- Redis (local)
- `uv` package manager
You can install Migru as a global command-line tool directly:

```bash
# If running from the source directory (development)
# Use editable mode and pin Python 3.12 for binary compatibility
uv tool install -e . --python 3.12

# Or install globally without cloning (once published)
uv tool install migru --python 3.12
```

Then simply run:

```bash
migru
```

Create a `.env` file in your working directory with your API keys:
```bash
# Cloud AI Providers (Default Mode)
MISTRAL_API_KEY=...     # Primary Intelligence
CEREBRAS_API_KEY=...    # Ultra-Fast Responses (Recommended)
OPENROUTER_API_KEY=...  # Fallback Provider

# Optional: Web Search & Weather
FIRECRAWL_API_KEY=...   # Deep Research
OPENWEATHER_API_KEY=... # Environmental Context

# Optional: Local LLM (see Local LLM Support section)
LOCAL_LLM_ENABLED=false
LOCAL_LLM_HOST=http://localhost:8080
PRIVACY_MODE=cloud
```

Migru is primarily a CLI application.
If installed globally:

```bash
migru
```

Or run from source:

```bash
# Standard mode (cloud AI)
uv run -m app.main

# Enhanced mode (with local LLM support)
uv run -m app.main_enhanced
```

Start with a specific user context (loads your personal history/patterns):

```bash
migru --user "Alex"
```

Launch with high-contrast UI and reduced motion/animations:

```bash
migru --accessible
```

| Flag | Short | Description |
|---|---|---|
| `--user <name>` | `-u` | Sets the active user profile name (default: "Friend") |
| `--accessible` | `-a` | Enables high-contrast, reduced-motion UI |
| `--quiet` | `-q` | Suppresses startup banner and welcome messages |
| `--verbose` | `-v` | Shows detailed performance logs and debug info |
Once inside the chat, use these slash commands to interact with the system:
| Command | Description | Example Output |
|---|---|---|
| `/profile` | View your learned preferences & bio context | Work: Remote, Sensitivities: Light |
| `/patterns` | See discovered wellness rhythms | Peak Symptom Hour: 10:00 AM |
| `/bio <args>` | Simulate biometric data input | `/bio hr=110 sleep=60` |
| `/model` | Switch AI models dynamically | Switched to Mistral AI |
| `/history` | View recent conversation memories | Last topic: Magnesium for relief |
| `/clear` | Clear the terminal screen | (Clears screen) |
| `/exit` | End the session gracefully | (Saves state and exits) |
| Command | Description | Example |
|---|---|---|
| `/privacy status` | Check current privacy settings | Shows local/hybrid/cloud mode |
| `/privacy local` | Switch to 100% private mode | Disables all external APIs |
| `/privacy hybrid` | Local AI + optional search | Balances privacy and features |
| `/local status` | Show current local model | Using Qwen2.5:3B |
| `/local models` | List available local models | Shows all downloaded models |
| `/local switch <model>` | Switch local models | `/local switch qwen2.5:3b` |
| `/local test` | Test local LLM connection | Verifies server is running |
You can simulate wearable/sensor data directly from the CLI to test the Real-Time Analytics engine.
Example: simulating high stress:

```bash
/bio hr=120 sleep=50 steps=200
```

System response: the analytics engine fuses this elevated heart-rate data with your conversation. If you then say "I feel anxious", the system triggers a Reactive Alert because the physiological and verbal signals correlate.
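The fusion logic can be approximated as follows. The thresholds, keywords, and function names here are illustrative assumptions, not the engine's real values:

```python
def parse_bio(args: str) -> dict[str, int]:
    """Parse '/bio hr=120 sleep=50' style arguments into a dict."""
    return {k: int(v) for k, v in (pair.split("=") for pair in args.split())}


def reactive_alert(bio: dict[str, int], message: str) -> bool:
    """Fire only when physiological AND verbal stress signals correlate.

    Assumed thresholds: heart rate above 100 bpm or a sleep score below 60
    counts as physiological stress; certain keywords count as verbal stress.
    """
    physio_stress = bio.get("hr", 0) > 100 or bio.get("sleep", 100) < 60
    verbal_stress = any(w in message.lower() for w in ("anxious", "stressed", "panic"))
    return physio_stress and verbal_stress
```

Requiring both signals keeps the alert from firing on a high heart rate alone (which might just mean exercise) or on stress language alone.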
Migru automatically detects when you need external facts.
Trigger word: `define`

You: "Define prodrome phase."

Migru (Research Agent):

> ## Key Findings
> - The prodrome is the "pre-headache" phase, occurring hours or days before pain.
> - Symptoms include yawning, mood changes, and food cravings.
> - Recognizing it can allow for early intervention.
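Trigger-based routing to the research agent can be sketched like this. Only `define` is documented above; the other trigger words and both function names are assumptions for illustration:

```python
# 'define' comes from the docs; the other triggers are assumed examples.
RESEARCH_TRIGGERS = ("define", "research", "look up")


def needs_research(message: str) -> bool:
    """Detect whether a message asks for external facts."""
    text = message.lower()
    # Match a trigger at the start of the message or as a whole word/phrase.
    return any(text.startswith(t) or f" {t} " in f" {text} " for t in RESEARCH_TRIGGERS)


def route(message: str) -> str:
    """Send fact-seeking messages to research, everything else to the companion."""
    return "research_agent" if needs_research(message) else "companion_agent"
```

Everyday emotional check-ins bypass the web entirely, which is what keeps the search tools "only when you need them."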
- Local-First Privacy: Your conversation history and patterns are stored in your own local Redis instance, ensuring your wellness data stays private.
- Adaptive Context: Automatically adjusts persona (calm vs. energetic) based on your detected mood.
- Data Fusion: Correlates chat logs with simulated biometric streams using Pathway.
- Redis Pipelining: Atomic, low-latency updates for real-time pattern tracking.
- Dynamic Routing: Intelligently switches between "Fast" (Cerebras) and "Smart" (Mistral) models based on query complexity.
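The fast/smart split could be implemented with a simple complexity heuristic. The word-count cutoff, marker list, and model identifiers below are an illustrative guess, not Migru's actual router:

```python
def pick_model(query: str) -> str:
    """Route short, simple queries to the fast model; complex ones to the smart model.

    Assumed heuristic: long queries or queries containing analytical
    keywords get the "Smart" (Mistral) model, everything else the
    "Fast" (Cerebras) model.
    """
    complex_markers = ("why", "explain", "compare", "analyze", "plan")
    text = query.lower()
    is_complex = len(query.split()) > 25 or any(m in text for m in complex_markers)
    return "mistral-small" if is_complex else "cerebras/llama3.1-8b"
```

The payoff is latency: quick acknowledgments come back in the 1-3 second range, while deeper reasoning questions accept a slower, smarter model.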
📖 Read the End-to-End Architecture Deep Dive
For deep performance tuning, see `PERFORMANCE.md`.
- `HOW_IT_WORKS.md` - Detailed explanation of Pathway integration and data flow
- `PERFORMANCE.md` - Performance optimization guide
- `AGENTS.md` - Agent development guidelines
- `.env.example` - Configuration template