A music-reactive dancing robot built on Raspberry Pi that dances to the beat of your music with LED matrix eyes that express emotions.
This project started at "The Robot Rave", a hackathon held at the SoTa (State of the Art) event with Cocoa. What began as an experimental idea to make a robot dance to music has evolved into a full-featured music-reactive robot with personality!
- Music-Reactive Dancing: Real-time beat detection and BPM analysis to sync robot movements with music
- LED Matrix Eyes: Freenove 8x16 LED matrix displays expressive eye animations that react to music and movement
- Multiple Dance Patterns: Various dance styles including bounce, sway, shuffle, pump, and more
- Autonomous Mode: Robot listens for music and automatically starts/stops dancing
- Web Interface: Beautiful responsive control panel accessible from any device on the network
- Audio Visualization: Real-time waveform and spectrum analyzer display
- Smart Silence Detection: Robot stops dancing within ~500ms when music stops
- 3D Gaussian Splat Viewer: Immersive WebGL visualization of Ravitto with 50k+ particles
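The ~500 ms cutoff in the smart silence detection can be approximated with a rolling window of per-frame RMS energy. A minimal sketch, assuming 50 ms frames and an illustrative threshold (not the project's actual values):

```python
from collections import deque

class SilenceDetector:
    """Flags silence once RMS energy stays below a threshold for ~500 ms."""

    def __init__(self, frame_ms=50, window_ms=500, threshold=0.01):
        # Keep only the most recent ~500 ms worth of frames
        self.frames = deque(maxlen=window_ms // frame_ms)
        self.threshold = threshold

    def update(self, rms):
        self.frames.append(rms)
        # Silent only when the window is full AND every frame in it is quiet,
        # so a brief gap between beats never stops the dance
        return (len(self.frames) == self.frames.maxlen
                and all(f < self.threshold for f in self.frames))

detector = SilenceDetector()
for _ in range(9):
    detector.update(0.001)
print(detector.update(0.001))  # True: ten consecutive quiet 50 ms frames
```

A single loud frame resets the countdown, since `all()` fails until the window is quiet again.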
Experience Ravitto in a stunning 3D particle cloud visualization! The Gaussian splat viewer renders the robot as thousands of floating particles with realistic colors and lighting.
Features:
- 50,000+ Particles: High-fidelity point cloud forming Ravitto's shape
- Orbital Controls: Drag to rotate, scroll to zoom, double-click to reset
- Quality Levels: 1x (25k) to 8x (200k) particles for any device
- Color Modes: Normal, Neon, Thermal, and Golden rendering styles
- Explode Effect: Watch particles burst outward and reform
- Auto-Rotate: Cinematic orbiting camera animation
Browser Requirements:
- Modern browser with WebGL support (Chrome, Firefox, Safari, Edge)
- Hardware acceleration enabled
- For best performance: dedicated GPU recommended for 4x/8x quality modes
Access it at `/splat` while the backend is running, or view the live demo.
Troubleshooting:
- If the viewer shows a black screen, check that WebGL is enabled in your browser
- For slow performance, try reducing quality to 1x or 2x mode
- On mobile devices, start with 1x quality for smoother experience
- Raspberry Pi 4 (or 3B+)
- CamJam EduKit 3 (or compatible motor driver)
- 2x DC Motors with wheels
- USB Microphone
- Freenove 8x16 LED Matrix (HT16K33) - Optional but recommended
- Power supply for motors
Motor wiring:

| Motor | GPIO Pin 1 | GPIO Pin 2 |
|---|---|---|
| Left | 9 | 10 |
| Right | 7 | 8 |
LED matrix wiring (I2C):

| Matrix Pin | Pi Pin |
|---|---|
| VCC | 3.3V |
| GND | GND |
| SDA | SDA |
| SCL | SCL |
Status LED wiring:

| LED | GPIO Pin |
|---|---|
| Status | 25 |
```bash
sudo raspi-config
# Navigate to: Interface Options > I2C > Enable

sudo apt-get update
sudo apt-get install -y portaudio19-dev i2c-tools

pip3 install flask numpy sounddevice scipy --break-system-packages
pip3 install adafruit-circuitpython-ht16k33 --break-system-packages

i2cdetect -y 1
# Should show devices at 0x70 and 0x71 for the LED matrix

cd /path/to/robot-rave
python3 robot_backend.py
```

Open a browser and navigate to:

- From the Pi: `http://localhost:5000`
- From another device: `http://<pi-ip-address>:5000`
- START AUTONOMOUS MODE: Robot listens for music and dances automatically
- Manual Controls: D-pad for direct motor control
- Sensitivity Slider: Adjust music detection sensitivity
- Gain Slider: Adjust microphone input gain
- EMERGENCY STOP: Immediately stops all movement
```
robot-rave/
├── robot_backend.py      # Main backend server with all logic
├── robot_frontend.html   # Web interface (served by Flask)
├── ravitto_studio.html   # Interactive 360° photo viewer
├── ravitto_splat.html    # 3D Gaussian splat WebGL viewer
├── ravitto_360.gif       # Robot showcase animation
├── images/               # Robot photos
├── README.md             # This file
├── CLAUDE.md             # Claude Code guidelines
└── AGENT.md              # AI assistant guidelines
```
| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Serves the web frontend |
| `/splat` | GET | 3D Gaussian splat viewer |
| `/studio` | GET | Interactive 360° photo studio |
| `/api/status` | GET | Returns current robot state as JSON |
| `/api/control/<cmd>` | POST | Send control commands (`forward`, `backward`, `left`, `right`, `stop`, `toggle_auto`) |
| `/api/sens/<val>` | POST | Set sensitivity (0-100) |
| `/api/gain/<val>` | POST | Set microphone gain (0-100) |
| `/api/eyes/<expression>` | POST | Set eye expression |
| `/api/eyes/special/<type>` | POST | Trigger special eye animation |
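The POST endpoints can also be driven from a script. A hypothetical client-side helper that validates commands and clamps values before sending; the actual HTTP call is left commented so the sketch runs without a live robot, and `raspberrypi.local` is a placeholder for your Pi's address:

```python
# Command names match the /api/control/<cmd> table above
VALID_COMMANDS = {"forward", "backward", "left", "right", "stop", "toggle_auto"}

def control_url(base, cmd):
    """Build the /api/control URL, rejecting unknown commands client-side."""
    if cmd not in VALID_COMMANDS:
        raise ValueError(f"unknown command: {cmd!r}")
    return f"{base}/api/control/{cmd}"

def sens_url(base, val):
    """Build the /api/sens URL, clamping sensitivity to the 0-100 range."""
    return f"{base}/api/sens/{max(0, min(100, int(val)))}"

base = "http://raspberrypi.local:5000"  # placeholder address
print(control_url(base, "toggle_auto"))
print(sens_url(base, 150))  # clamps to 100
# To actually send: requests.post(control_url(base, "stop"))
```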
The LED matrix eyes support multiple expressions:
- `normal` - Default relaxed eyes
- `happy` - Curved happy eyes
- `excited` - Wide open eyes
- `sleepy` - Droopy tired eyes
- `angry` - Furrowed brow
- `look_left`, `look_right`, `look_up` - Directional looking

Special animations: `heart`, `star`, `dizzy`
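Under the hood, each expression reduces to an 8-row bitmap written to the HT16K33. A hypothetical sketch of how frames could be stored and previewed without hardware; the bit patterns below are illustrative, not the project's actual artwork:

```python
# One int per row; each of the 16 bits drives one column LED on the 8x16 matrix
EXPRESSION_FRAMES = {
    "normal": [0x0000, 0x3C3C, 0x4242, 0x4242, 0x4242, 0x4242, 0x3C3C, 0x0000],
    "sleepy": [0x0000, 0x0000, 0x0000, 0x0000, 0x3C3C, 0x7E7E, 0x0000, 0x0000],
}

def preview(rows):
    """Render row bitmasks as ASCII art for debugging off-hardware."""
    return ["".join("#" if row >> (15 - col) & 1 else "." for col in range(16))
            for row in rows]

for line in preview(EXPRESSION_FRAMES["normal"]):
    print(line)
```

On the robot, each row int would instead be unpacked into the matrix driver's pixel buffer.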
- Audio Capture: USB microphone captures ambient audio at 44.1kHz
- Feature Extraction: Analyzes RMS energy, spectral features, and frequency bands
- Beat Detection: Identifies beats using onset detection and tempo estimation
- Music Classification: Distinguishes music from silence, noise, and speech
- Dance Engine: Selects appropriate dance patterns based on BPM, energy, and dominant frequency band
- Motor Control: Executes dance moves through PWM-controlled DC motors
- Eye Animation: Updates LED matrix based on energy, beats, and movement direction
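The beat-detection step of the pipeline above can be sketched as a trailing-average onset detector. A toy version, assuming per-frame energies as input (the real backend also uses spectral features and tempo estimation):

```python
import numpy as np

def detect_beats(energies, window=8, k=1.5):
    """Flag frames whose energy exceeds k times the trailing-window average."""
    beats = []
    for i in range(window, len(energies)):
        baseline = np.mean(energies[i - window:i])
        # Refractory gap of 2 frames avoids double-triggering on one onset
        if energies[i] > k * baseline and (not beats or i - beats[-1] > 2):
            beats.append(i)
    return beats

# Synthetic signal: an energy spike every 10 frames over a quiet floor
frames = np.full(60, 0.1)
frames[::10] = 1.0
print(detect_beats(frames))  # [10, 20, 30, 40, 50]
# At ~50 ms per frame, a 10-frame beat period is 0.5 s, i.e. 120 BPM
```

Averaging the spacing between detected beats is one simple way to turn this into the BPM estimate that drives pattern selection.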
Contributions are welcome! Feel free to submit issues and pull requests.
MIT License - Feel free to use, modify, and distribute.
- SoTa Event & Cocoa - For hosting the hackathon that sparked this project
- CamJam EduKit - For the excellent robotics kit
- Adafruit - For the HT16K33 LED matrix library
- All the hackathon participants who cheered on the dancing robot!
Made with music and motors at The Robot Rave hackathon