Audio-controlled gaming experiment | ๐ฎ Live Demo | ๐งช Audio Tools | ๐ฆ GitHub
From breath detection to clap control - exploring the future of biological computing interfaces. Your microphone becomes the controller.
Breath Quest started as a breath-controlled gaming experiment and evolved into a comprehensive audio interface research platform. The project validates audio-controlled gaming interfaces through three working prototypes, from breath detection to clap-controlled gaming.
- ๐ Clap-Controlled Gaming - Single clap to jump, double clap to shoot, triple clap for special powers
- ๐ซ Advanced Breath Detection - Multi-feature fusion algorithms for precise breathing pattern recognition
- ๐ Real-time Audio Analysis - 7+ visualization tools for understanding audio signatures
- ๐ฏ Personal Calibration - Adaptive systems that learn your unique patterns
- ๐ฅ Research Tools - Record and analyze sessions for interface validation
- ๐ Browser-Based - No installation required, works on any device with a microphone
Original Hypothesis: Breath-controlled gaming could create engaging, wellness-focused experiences. Discovery: While breath detection was technically achievable, clap detection proved far more responsive and satisfying for gaming. Result: A hybrid platform that demonstrates both approaches, with clap gaming as the primary experience.
# Clone the repository
git clone https://github.com/alibad/breathquest.git
cd breathquest
# Install dependencies
npm install
# Start development server
npm run dev
# Open http://localhost:3000Visit breather.quest to experience audio-controlled gaming instantly.
This project validates 3 core hypotheses about audio-controlled interfaces:
Consumer microphones can reliably detect breathing patterns with sufficient accuracy for real-time gaming
Status: ๐ฏ VALIDATED - All 3 phases complete:
- Phase 1: Basic RMS breath detection
- Phase 2: Research-enhanced algorithms (multi-feature fusion)
- Phase 3: Personal calibration system with breathing profiles
Audio-controlled gameplay (clap detection) is significantly more engaging than breath control for gaming
Status: ๐ฏ VALIDATED - Clap detection provides:
- Instant responsiveness (<16ms latency)
- Natural gaming gestures (clap patterns map to game actions)
- Zero calibration required
- Universal device compatibility
A polished clap-controlled runner game with onboarding, scoring, lives, and game polish can be engaging
Status: ๐ฏ VALIDATED - ClapQuest features:
- Complete game loop with onboarding tutorial
- Scoring system with multipliers and high scores
- Lives system and game over mechanics
- Visual polish with particles and animations
Microphone Input โ Web Audio API โ Feature Analysis โ Pattern Recognition โ Game Controls
โ โ
Breath Features: RMS, Spectral Clap Detection: Amplitude Spikes,
Centroid, Zero Crossing Rate, Zero Crossings, Pattern Matching
Frequency Bands, Envelope, LPC
Breath Detection:
- Multi-feature fusion combining 6+ audio characteristics
- Personal calibration for individual breathing patterns
- Noise filtering and confidence scoring
- <100ms latency with high accuracy
Clap Detection:
- High-amplitude spike detection with zero-crossing analysis
- Pattern matching for single/double/triple clap sequences
- Refractory period to prevent false triggers
- <16ms latency with instant feedback
- โฑ๏ธ Time Domain Analysis - Raw waveform visualization with zero crossing detection
- ๐ต Frequency Domain Analysis - FFT with spectral centroid calculation
- ๐ Amplitude Envelope Analysis - Hilbert transform and peak follower algorithms
- ๐ Multi-Band Frequency Analysis - 8-band energy distribution monitoring
- ๐ Clap Detection Visualizer - Real-time clap pattern recognition display
- ๐ซ Breath Detection Meter - Multi-feature breath analysis with confidence scoring
- ๐ฅ Video Recording - Capture analysis sessions for research and validation
| Technology | Purpose | Implementation |
|---|---|---|
| Next.js 15 | Framework | App Router, Server Components |
| Web Audio API | Audio Processing | Real-time microphone analysis |
| TypeScript | Type Safety | Full type coverage |
| Canvas API | Visualizations | Real-time audio waveforms |
| Local Storage | Calibration Data | Personal breathing profiles |
- HCI Research: Novel interface design patterns
- Audio Processing: Breath detection algorithm validation
- Health Tech: Non-invasive breathing monitoring
- Game Design: Biometric input methods
- Wellness Apps: Breathing exercise gamification
- Accessibility: Voice-free computer control
- VR/AR: Natural breathing as input modality
- IoT Health: Ambient breathing monitoring
We welcome contributions!
# Development workflow
1. Fork the repository
2. Create feature branch: git checkout -b feature/amazing-feature
3. Commit changes: git commit -m 'Add amazing feature'
4. Push to branch: git push origin feature/amazing-feature
5. Open a Pull Request| Resource | Description | Link |
|---|---|---|
| Live Demo | Interactive breath gaming | breather.quest |
| Audio Tools | Real-time analysis suite | breather.quest/audio-tools |
| Hypothesis 1 | Technical validation | breather.quest/hypothesis-1 |
| Research Docs | Academic findings | docs/hypothesis/ |
"If we're building AGI that understands humans deeply, shouldn't our interfaces reflect human biology? Breathing is universal, involuntary yet controllable, calming yet energizing. It's the perfect bridge between mind and machine."
- Stress-aware AI that adapts to your breathing
- Health-improving interfaces that make you calmer by using them
- Natural control systems based on involuntary biological signals
- Embodied AI interaction that feels human, not mechanical
This project is licensed under the MIT License - see the LICENSE file for details.
Built with passion for the future of human-AI interaction
- ๐ Website: breather.quest
- ๐ผ LinkedIn: Connect with the creator
- ๐ GitHub: @alibad
โญ Star this repo if you believe in biological computing interfaces!