This project implements a virtual mouse controlled by hand gestures captured through your webcam. It uses MediaPipe for real-time hand tracking and PyAutoGUI to move the cursor, scroll, and click, all with your fingers. Ideal for touchless control or fun automation.
- Real-time hand tracking using MediaPipe
- Control mouse cursor with your index finger
- Scroll up/down with two fingers or fist gestures
- Perform mouse click with all fingers extended
- Exit using the "Hang Loose" gesture 🤙 (thumb and pinky extended)
- Uses a gesture-based UI; no physical mouse required
| Library | Purpose | 
|---|---|
| cv2 (OpenCV) | Captures video from the webcam & visualizes output | 
| mediapipe | Detects and tracks hand landmarks | 
| pyautogui | Controls the mouse and keyboard | 
| time | Manages delays and debounce functionality | 
- MediaPipe: Lightweight and efficient ML framework by Google for hand/pose tracking.
- PyAutoGUI: Easy-to-use Python library to control mouse and keyboard events.
- OpenCV: For accessing camera feed and overlaying visual cues (like finger landmarks).
- Time: Prevents rapid/unintended repeated actions (debounce logic).
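
To see how these libraries fit together, here is a minimal sketch (not the project's actual script) that reads the webcam with OpenCV, tracks one hand with MediaPipe, and moves the cursor to the index fingertip with PyAutoGUI. The window name, confidence threshold, and Esc-to-quit shortcut are illustrative choices, not taken from the project.

```python
import cv2
import mediapipe as mp
import pyautogui

mp_hands = mp.solutions.hands
hands = mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
screen_w, screen_h = pyautogui.size()
pyautogui.FAILSAFE = False  # keep the corner fail-safe from aborting this demo

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror the image so movement feels natural
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; coordinates are normalized to 0..1
        tip = results.multi_hand_landmarks[0].landmark[8]
        pyautogui.moveTo(tip.x * screen_w, tip.y * screen_h)
    cv2.imshow("Virtual Mouse", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits this demo
        break
cap.release()
cv2.destroyAllWindows()
```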
git clone https://github.com/yourusername/virtual-mouse-gestures.git
cd virtual-mouse-gestures
python -m venv venv
venv\Scripts\activate   # On Windows
pip install -r requirements.txt

`requirements.txt` should contain:

opencv-python
mediapipe
pyautogui

Then run the script:

python your_script_name.py

| Gesture | Action | 
|---|---|
| ☝️ Index finger up | Move mouse | 
| ✌️ Index + middle finger | Scroll up | 
| ✊ Fist | Scroll down | 
| 🖐️ All fingers extended | Click | 
| 🤙 Hang Loose (thumb + pinky) | Exit program | 
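
One common way to implement this mapping (a sketch only; the project's exact landmark checks and thresholds are not shown in this README) is to test, for each finger, whether its tip landmark lies above its middle joint and then match the resulting pattern:

```python
# Hypothetical helpers; landmark indices follow MediaPipe Hands (tips: 4, 8, 12, 16, 20).
def fingers_up(hand_landmarks):
    lm = hand_landmarks.landmark
    fingers = []
    # Thumb: compare tip (4) and joint (3) on the x axis; which side to test
    # depends on handedness and whether the frame is mirrored.
    fingers.append(lm[4].x < lm[3].x)
    # Other fingers: tip higher than the PIP joint (smaller y = higher in the image).
    for tip in (8, 12, 16, 20):
        fingers.append(lm[tip].y < lm[tip - 2].y)
    return fingers  # [thumb, index, middle, ring, pinky]

def classify(fingers):
    thumb, index, middle, ring, pinky = fingers
    if all(fingers):
        return "click"                       # all fingers extended
    if thumb and pinky and not (index or middle or ring):
        return "exit"                        # Hang Loose
    if index and middle and not (ring or pinky):
        return "scroll_up"
    if index and not (middle or ring or pinky):
        return "move"
    if not any(fingers):
        return "scroll_down"                 # fist
    return None                              # unrecognized pose: do nothing
```

The returned label can then be dispatched to `pyautogui.click()`, `pyautogui.scroll(40)` / `pyautogui.scroll(-40)`, a cursor move, or a break out of the main loop.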
- Works best in well-lit environments.
- Webcam resolution and frame rate affect performance.
- Smoothing and debounce can be fine-tuned via constants (see the sketch below):
  - SMOOTHING_FACTOR
  - DEBOUNCE_TIME
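
As a rough illustration of how those two constants could be wired up (the names come from the notes above; the values and formulas are placeholders rather than the project's actual tuning):

```python
import time

SMOOTHING_FACTOR = 0.2   # 0..1; lower = smoother but laggier cursor
DEBOUNCE_TIME = 0.5      # seconds to wait before repeating a click/scroll

class CursorSmoother:
    """Exponential smoothing: blend each new fingertip position with the last one."""
    def __init__(self):
        self.x = self.y = None

    def update(self, target_x, target_y):
        if self.x is None:                 # first frame: jump straight to the target
            self.x, self.y = target_x, target_y
        else:
            self.x += (target_x - self.x) * SMOOTHING_FACTOR
            self.y += (target_y - self.y) * SMOOTHING_FACTOR
        return self.x, self.y

class Debouncer:
    """Suppress actions that fire again within DEBOUNCE_TIME of the last one."""
    def __init__(self):
        self.last = 0.0

    def ready(self):
        now = time.time()
        if now - self.last >= DEBOUNCE_TIME:
            self.last = now
            return True
        return False
```

Typical use: `x, y = smoother.update(raw_x, raw_y)` before every `pyautogui.moveTo`, and `if debouncer.ready(): pyautogui.click()` when the click gesture is detected.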
 
- Add a gesture for right-click or drag.
- Include volume control or media gestures.
- Show on-screen cursor overlay.
- Use a Kalman filter or AI-based stabilization for smoother movement (see the sketch below).
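
For the Kalman-filter idea, OpenCV's built-in `cv2.KalmanFilter` could serve as a starting point. The constant-velocity model and noise values below are illustrative guesses, not a tested configuration:

```python
import cv2
import numpy as np

# Constant-velocity Kalman filter over the cursor state [x, y, vx, vy],
# observing only [x, y]. Covariance values are illustrative and need tuning.
kf = cv2.KalmanFilter(4, 2)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], np.float32)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-3
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1
kf.errorCovPost = np.eye(4, dtype=np.float32)  # start uncertain so early measurements dominate

def stabilize(x, y):
    """Feed the raw fingertip position; return a smoothed (x, y) estimate."""
    kf.predict()
    est = kf.correct(np.array([[x], [y]], np.float32))
    return float(est[0, 0]), float(est[1, 0])
```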