⚠️ Disclaimer / Important Notice

This repository contains a personal hobby project and an early-stage development prototype.
- This is NOT a medical device.
- This project is NOT safety-certified.
- It is NOT intended for diagnosis, treatment, mitigation, or prevention of any medical, psychological, or health condition.
- You use, modify, build, or distribute this project entirely at your own risk.
The author provides this project "AS IS", without warranty of any kind, express or implied, including but not limited to fitness for a particular purpose. In no event shall the author be held liable for any claim, damage, injury, loss of data, or legal consequence arising from the use of this project.
This repository exists strictly for learning, experimentation, and research purposes.
HabiZap is an experimental wearable wristband project focused on habit interruption using on-device machine learning.
Instead of passively tracking metrics or showing notifications, HabiZap attempts to:
- Detect repetitive or harmful hand gestures (e.g. nail biting, hair pulling)
- Classify them locally using ML inference
- Respond immediately with haptic feedback to interrupt the habit loop
The project intentionally avoids displays and cloud dependencies to remain:
- Minimal
- Private
- Power-efficient
- Hackable
3D prototype design: Special thanks to MaryBR for designing the early 3D enclosure prototype.
- LinkedIn: https://www.linkedin.com/in/mary-br/
The STL file will be shared soon in this repository.
- Prototype with the XIAO ESP32S3 + MPU6050 + vibration motor
- Basic ML integration with Edge Impulse
- Basic gesture detection and haptic feedback
- Battery voltage monitoring
- Design a custom PCB and small case for it
- Custom App for the wristband
- Auto train the ML without manual Edge-Impulse training
ESP32-S3 MCU, MPU6050 IMU, vibration motor module, and single-cell Li‑Po battery
- Wristband-first: small, lightweight, unobtrusive
- No display: reduces distraction and complexity
- Local-first ML: inference happens on-device
- No mandatory internet: no always-on cloud
- Prototype-friendly: optimized for iteration, not polish
The project uses the XIAO ESP32S3 as its main MCU due to its:
- Integrated single-cell Li‑Po battery charger
- Sufficient compute for ML inference
- Vector instructions useful for ML workloads
- Very small physical footprint
Reference: https://wiki.seeedstudio.com/xiao_esp32s3_getting_started/
The prototype uses an MPU‑6050 module for motion data:
- 3‑axis accelerometer
- 3‑axis gyroscope
- I²C interface
- 1024‑byte FIFO buffer
Connections:
- SDA → ESP32 SDA
- SCL → ESP32 SCL
- VCC → 3.3V
- GND → GND
- AD0 → GND (I²C address 0x68)
Unused pins:
- INT, XDA, XCL
A vibration motor module provides haptic feedback when a target gesture is detected.
Typical specs:
- Operating voltage: 3.0 – 5.3 V
- Rated current: ~60 mA
- Startup current: ~90 mA
⚠️ The vibration motor is powered directly from the battery, not from the ESP32 regulator.
The module already includes a MOSFET and required driver components.
Pins:
- VCC → Battery +
- IN → ESP32 PWM pin
- GND → Ground
- Single‑cell Li‑Po battery
- Capacity: ~150–700 mAh
Charging is handled by the XIAO ESP32S3 internal charger.
Because there is no integrated fuel gauge, battery voltage is measured using a simple ADC voltage divider:
- R1: 47kΩ
- R2: 100kΩ
Reference: https://wiki.seeedstudio.com/check_battery_voltage/
| ID | Component | Value | Purpose |
|---|---|---|---|
| R1 | Resistor | 47kΩ | Battery voltage divider |
| R2 | Resistor | 100kΩ | Battery voltage divider |
| – | MPU6050 | GY-521 | Motion sensing |
| – | Vibration motor | Module | Haptic feedback |
| – | Li-Po battery | 150–700 mAh | Power source |
Prototype-level schematic (power, IMU, vibration motor, and ADC voltage divider)
- Framework: ESP‑IDF
- OS: FreeRTOS
- Language: C (core firmware)
- ML inference: C/C++ code generated by Edge Impulse
Firmware responsibilities:
- Sensor sampling and buffering
- Task scheduling via FreeRTOS
- ML inference execution
- Gesture classification
- Haptic feedback control
- Optional streaming for training
The project uses Kconfig (via ESP‑IDF) to configure build‑time behavior.
Key configuration options include:
- Training Mode Enable
  - Enables raw sensor data streaming for ML data collection
  - Disables inference‑only optimizations
- Sampling Configuration
  - Motion sensor sample rate
  - Buffer sizes
- Debug / Logging Options
  - Serial logging level
  - Development diagnostics
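A hypothetical Kconfig fragment illustrating the shape of such options (the symbol names here are invented for illustration, not the project's actual ones):

```kconfig
menu "HabiZap Configuration"

    config HABIZAP_TRAINING_MODE
        bool "Enable training mode (stream raw sensor data)"
        default n
        help
            Streams raw IMU samples over serial for Edge Impulse data
            collection. Disable for normal inference use.

    config HABIZAP_SAMPLE_RATE_HZ
        int "Motion sensor sample rate (Hz)"
        default 50

endmenu
```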
You can access the configuration menu using:

```sh
idf.py menuconfig
```
⚠️ Make sure to disable training mode for normal inference usage to reduce power consumption.
- Prototype stage
- Off‑the‑shelf modules
- No custom enclosure yet
Future revisions will introduce:
- Custom PCB
- Smaller footprint
- Wristband‑friendly enclosure
🚧 Not implemented yet
Planned responsibilities:
- Device configuration
- Training session control
- Gesture labeling
- Sensitivity tuning
- OTA firmware updates
The current ML workflow relies on Edge Impulse for data collection and model training.
- Edge Impulse account
- Node.js
- ESP‑IDF toolchain installed
- Create a new Edge Impulse project
- Install the Edge Impulse CLI:

  ```sh
  npm install -g edge-impulse-cli
  ```

- Enable training mode in firmware:

  ```sh
  idf.py menuconfig
  ```

  Enable the HabiZap training option.
- Build and flash the firmware
- Close the serial monitor
- Run the data forwarder:

  ```sh
  edge-impulse-data-forwarder
  ```

- When prompted for sensor labels, enter:

  ```
  ax,ay,az,gx,gy,gz
  ```
- Collect data via the Edge Impulse dashboard
After training and deploying your model from Edge Impulse:
- Download the C/C++ library generated by Edge Impulse
- Extract and copy the generated files into the root of this project
- Ensure the Edge Impulse source files are included in the build
- Rebuild the firmware
⚠️ This step is required. The project does not fetch Edge Impulse artifacts automatically.
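The Edge Impulse C/C++ export typically unpacks into `edge-impulse-sdk/`, `model-parameters/`, and `tflite-model/` folders. One way to pull them into an ESP-IDF build is a component CMakeLists along these lines; the paths and glob patterns are illustrative, so check them against your actual export before relying on them:

```cmake
# Illustrative only - adjust paths to match your Edge Impulse export.
file(GLOB_RECURSE EI_SRCS
     "edge-impulse-sdk/*.cpp" "edge-impulse-sdk/*.c" "edge-impulse-sdk/*.cc"
     "tflite-model/*.cpp")

idf_component_register(
    SRCS ${EI_SRCS}
    INCLUDE_DIRS "." "edge-impulse-sdk" "model-parameters" "tflite-model")
```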
🚧 Active experimental prototype
Expect:
- Breaking changes
- Incomplete features
- Hardware revisions
This project is released for personal, educational, and experimental use.
Refer to the LICENSE file for details.
HabiZap is an exploration of:
- Embedded ML
- Wearable hardware design
- Gesture recognition
- Minimalist product thinking
If you build upon this project, please do so responsibly and ethically.




