A computer vision-powered virtual whiteboard that enables drawing in mid-air using gesture recognition and object tracking.
Virtual Whiteboard transforms any webcam-equipped computer into an interactive drawing surface by detecting hand movements and a designated drawing object. The system tracks when your finger touches the object and renders strokes on a virtual canvas, creating a natural drawing experience without specialized hardware.
- Dual-hand Operation: Designated drawing and control hands with distinct functions
- Object Recognition: Calibration system detects any physical object as a drawing tool
- Touch Detection: Precise tracking of finger-to-object contact points
- Gesture Controls:
  - Index finger gesture for color selection
  - Thumb gesture for stroke width adjustment
- Whiteboard Modes: Toggle between camera overlay and full whiteboard views
- Writing Recognition: Optional auto-correction of handwritten characters
- Export Options: Save drawings as image files
- Python 3.10.x
- OpenCV 4.5+
- NumPy 1.20+
- MediaPipe 0.8+
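A requirements.txt covering these constraints might look like the sketch below (the PyPI package names are opencv-python, numpy, and mediapipe; the exact pins are illustrative, so defer to the repository's own requirements.txt):

```text
opencv-python>=4.5
numpy>=1.20
mediapipe>=0.8
```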
```bash
# Clone the repository
git clone https://github.com/barandev/virtual-whiteboard.git
cd virtual-whiteboard

# Create a virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

Run the application:

```bash
python virtual_whiteboard.py
```

On first run, the system will guide you through:
- Hand registration (drawing vs. control hand)
- Object calibration for drawing tool detection
The system requires identification of your drawing and control hands:
- Drawing Hand: Holds the object and draws with the index finger
- Control Hand: Controls color/size selection via gestures
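A minimal sketch of how the two hands could be told apart using MediaPipe's handedness output; the registration variable and label choice below are assumptions for illustration, not the project's actual logic:

```python
import mediapipe as mp

mp_hands = mp.solutions.hands

# Assumption for this sketch: the user registered their right hand as the drawing hand.
DRAWING_HAND_LABEL = "Right"

def split_hands(results):
    """Split MediaPipe results into (drawing_hand, control_hand) landmark sets."""
    drawing_hand, control_hand = None, None
    if results.multi_hand_landmarks:
        for landmarks, handedness in zip(results.multi_hand_landmarks,
                                         results.multi_handedness):
            label = handedness.classification[0].label  # "Left" or "Right"
            if label == DRAWING_HAND_LABEL:
                drawing_hand = landmarks
            else:
                control_hand = landmarks
    return drawing_hand, control_hand
```

Note that MediaPipe reports handedness assuming a mirrored (selfie-view) image, so the labels may need to be swapped if the frame is not flipped horizontally.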
Any object with a distinct color can be used as a drawing tool:
- Position object in the calibration frame
- System samples its color profile for reliable detection
- Default drawing color automatically matches the object
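As a rough illustration of that sampling step, the snippet below takes the median HSV color inside a fixed calibration box and turns it into threshold bounds; the ROI position and tolerances are made-up values, not the project's calibration parameters:

```python
import cv2
import numpy as np

def calibrate_object_color(frame, roi=(270, 190, 100, 100), tolerance=(10, 60, 60)):
    """Sample the median HSV color inside a calibration box and derive threshold bounds."""
    x, y, w, h = roi
    hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    median = np.median(hsv.reshape(-1, 3), axis=0)
    lower = np.clip(median - tolerance, 0, 255).astype(np.uint8)
    upper = np.clip(median + tolerance, 0, 255).astype(np.uint8)
    return lower, upper  # later usable with cv2.inRange to detect the drawing tool
```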
| Action | Result |
|---|---|
| Touch object with index finger | Begin drawing |
| Release finger from object | Stop drawing |
| Raise index finger (control hand) | Activate color selector |
| Raise thumb (control hand) | Activate size selector |
| Move control hand left/right | Change selected value |
| Key | Function |
|---|---|
| w | Toggle whiteboard mode |
| c | Clear canvas |
| s | Save drawing |
| e | Toggle eraser mode |
| r | Toggle writing recognition |
| t | Adjust touch threshold |
| b | Toggle debug display |
| h | Toggle help overlay |
| q | Quit application |
| 1-8 | Select predefined colors |
| 0 | Use calibrated object color |
| +/- | Manual size adjustment |
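In an OpenCV application, shortcuts like these are typically dispatched from the value returned by cv2.waitKey inside the main loop. The sketch below shows the idea for a few of the keys; the window name, canvas size, and output file name are placeholders:

```python
import cv2
import numpy as np

canvas = np.full((480, 640, 3), 255, dtype=np.uint8)  # white virtual canvas
whiteboard_mode = False

while True:
    cv2.imshow("Virtual Whiteboard", canvas)
    key = cv2.waitKey(1) & 0xFF
    if key == ord('w'):
        whiteboard_mode = not whiteboard_mode   # toggle whiteboard mode
    elif key == ord('c'):
        canvas[:] = 255                         # clear canvas
    elif key == ord('s'):
        cv2.imwrite("drawing.png", canvas)      # save drawing
    elif key == ord('q'):
        break                                   # quit application

cv2.destroyAllWindows()
```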
The application employs two primary detection methods:
- MediaPipe Hands: Tracks hand landmarks and finger positions
- HSV Color Thresholding: Identifies the calibrated drawing object
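A condensed sketch of how these two methods can run side by side on each frame; the HSV bounds, confidence value, and window name are placeholders rather than the project's calibrated settings:

```python
import cv2
import numpy as np
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

hands = mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7)

# Placeholder bounds -- in the application these come from object calibration.
lower_hsv = np.array([20, 100, 100], dtype=np.uint8)
upper_hsv = np.array([40, 255, 255], dtype=np.uint8)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # 1) Hand landmarks via MediaPipe (which expects RGB input).
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            mp_drawing.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)

    # 2) Drawing object via HSV color thresholding.
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imshow("detection preview", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```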
Hand gestures are detected using landmark relationships:
- Index Finger Selection: Extended index finger with other fingers curled
- Thumb Selection: Extended thumb with other fingers curled
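A common heuristic for the "index extended, others curled" pose compares each fingertip's y coordinate with its PIP joint; the sketch below assumes a roughly upright hand and is illustrative only:

```python
import mediapipe as mp

L = mp.solutions.hands.HandLandmark

def is_index_selection(hand_landmarks):
    """True if the index finger is extended while middle, ring, and pinky are curled."""
    lm = hand_landmarks.landmark
    # In image coordinates y grows downward, so an extended fingertip sits above its PIP joint.
    index_up = lm[L.INDEX_FINGER_TIP].y < lm[L.INDEX_FINGER_PIP].y
    others_curled = all(
        lm[tip].y > lm[pip].y
        for tip, pip in [(L.MIDDLE_FINGER_TIP, L.MIDDLE_FINGER_PIP),
                         (L.RING_FINGER_TIP, L.RING_FINGER_PIP),
                         (L.PINKY_TIP, L.PINKY_PIP)]
    )
    return index_up and others_curled
```

A thumb-selection check can be built the same way, typically comparing the thumb tip against the thumb IP joint on the x axis since the thumb extends sideways.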
Touch is registered when the distance between finger and object tips falls below a configurable threshold, with a stability filter to prevent jitter.
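The snippet below illustrates that combination of a distance threshold and a stability filter; the pixel threshold and the number of consecutive frames are assumed values (the application exposes the threshold via the t key):

```python
import math

TOUCH_THRESHOLD_PX = 30   # assumed distance below which finger and object count as touching
STABLE_FRAMES = 3         # assumed number of consecutive frames required to change state

class TouchFilter:
    """Debounce touch detection so single noisy frames do not start or end a stroke."""

    def __init__(self):
        self.touching = False
        self.counter = 0

    def update(self, finger_xy, object_xy):
        raw = math.dist(finger_xy, object_xy) < TOUCH_THRESHOLD_PX
        if raw != self.touching:
            self.counter += 1
            if self.counter >= STABLE_FRAMES:
                self.touching = raw
                self.counter = 0
        else:
            self.counter = 0
        return self.touching
```

Requiring several consecutive agreeing frames before the state flips keeps brief tracking glitches from splitting a stroke in two.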
The optional writing recognition system:
- Tracks completed strokes
- Normalizes stroke geometry
- Compares against character templates
- Renders clean characters when matches are found
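Sketched in code, that pipeline might look as follows; the resampling count, the scoring by mean point distance, and the template dictionary are stand-ins for whatever the project actually uses:

```python
import numpy as np

def normalize_stroke(points, n=32):
    """Resample a stroke to n points and scale it into a unit box centered at the origin."""
    pts = np.asarray(points, dtype=float)
    # Resample uniformly over the stroke index (a simple stand-in for arc-length resampling).
    idx = np.linspace(0, len(pts) - 1, n)
    resampled = np.stack(
        [np.interp(idx, np.arange(len(pts)), pts[:, d]) for d in (0, 1)], axis=1)
    resampled -= resampled.mean(axis=0)
    scale = np.abs(resampled).max()
    return resampled / scale if scale > 0 else resampled

def match_character(stroke, templates, max_error=0.25):
    """Return the template character with the lowest mean point distance, if under max_error."""
    norm = normalize_stroke(stroke)
    best_char, best_err = None, float("inf")
    for char, template in templates.items():
        err = np.mean(np.linalg.norm(norm - normalize_stroke(template), axis=1))
        if err < best_err:
            best_char, best_err = char, err
    return best_char if best_err <= max_error else None
```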
```
virtual-whiteboard/
├── virtual_whiteboard.py    # Main application
├── requirements.txt         # Dependencies
├── README.md                # Documentation
└── whiteboard_captures/     # Saved drawings
```
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/X-feature`)
- Commit your changes (`git commit -m 'Add X feature'`)
- Push to the branch (`git push origin feature/X-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- OpenCV community for computer vision tools
- MediaPipe team for hand tracking solutions