Control your Linux workspace with a flick of your wrist. Powered by Google's MediaPipe for high-performance, low-latency tracking.
Air Gesture turns your webcam into a touchless controller. It tracks your index finger and identifies rapid "swipes" to trigger system commands.
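The loop behind this is straightforward: read webcam frames, ask MediaPipe's hand landmarker for the index fingertip (landmark 8), and fire a swipe once the fingertip has moved far enough within a short window of frames. The sketch below only illustrates that idea; it is not Air Gesture's source, and details such as measuring travel in pixels are assumptions.

```python
# Minimal sketch: track the index fingertip and flag horizontal swipes.
# Assumes mediapipe>=0.10 and opencv-python, with hand_landmarker.task in the CWD.
from collections import deque

import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

THRESHOLD = 150    # distance (assumed pixels) the fingertip must travel
HISTORY_SIZE = 10  # frames considered per gesture window

landmarker = vision.HandLandmarker.create_from_options(
    vision.HandLandmarkerOptions(
        base_options=mp_tasks.BaseOptions(model_asset_path="hand_landmarker.task"),
        num_hands=1,
    )
)

history = deque(maxlen=HISTORY_SIZE)  # recent fingertip x positions (pixels)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    result = landmarker.detect(mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb))
    if result.hand_landmarks:
        tip = result.hand_landmarks[0][8]       # landmark 8 = index fingertip
        history.append(tip.x * frame.shape[1])  # normalized x -> pixel x
        if len(history) == HISTORY_SIZE:
            travel = history[-1] - history[0]
            if abs(travel) > THRESHOLD:
                print("SWIPE_RIGHT" if travel > 0 else "SWIPE_LEFT")
                history.clear()                 # avoid double-firing on one motion

cap.release()
```

Note that depending on whether the frame is mirrored, the left/right mapping may feel inverted and need flipping.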
Your config lives here: `~/.config/air-gesture/config.toml`

```toml
[settings]
threshold = 150    # Sensitivity (lower = easier to trigger)
history_size = 10  # Motion smoothness (higher = more stable)

[gestures]
SWIPE_LEFT = "qdbus org.kde.kglobalaccel /component/kwin invokeShortcut 'Switch to Previous Desktop'"
SWIPE_RIGHT = "qdbus org.kde.kglobalaccel /component/kwin invokeShortcut 'Switch to Next Desktop'"
```
| Option | What it does |
|---|---|
| `threshold` | Distance your finger must travel to count as a "swipe." |
| `history_size` | The number of frames tracked to verify a gesture. |
| `gestures` | Map any swipe to a terminal command or script. |
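Internally, the `[gestures]` table is just a mapping from swipe names to shell strings, so dispatching one is a standard-library job. Here is a minimal sketch of loading the config and firing a gesture command; it assumes Python 3.11+ for `tomllib`, uses a hypothetical helper name `run_gesture`, and is not the project's actual code.

```python
# Minimal sketch: load config.toml and run the command bound to a swipe.
# Assumes Python 3.11+ (tomllib) and the config layout shown above.
import pathlib
import subprocess
import tomllib

CONFIG = pathlib.Path.home() / ".config/air-gesture/config.toml"

with CONFIG.open("rb") as fh:
    config = tomllib.load(fh)

threshold = config["settings"]["threshold"]        # swipe sensitivity
history_size = config["settings"]["history_size"]  # frames per gesture window

def run_gesture(name: str) -> None:
    """Run the shell command mapped to a gesture name such as 'SWIPE_LEFT'."""
    command = config["gestures"].get(name)
    if command:
        subprocess.Popen(command, shell=True)  # fire and forget

run_gesture("SWIPE_LEFT")  # e.g. switches to the previous KDE desktop
```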
Linux
1. Clone & Enter

   ```bash
   git clone https://github.com/your-username/air-gesture.git
   cd air-gesture
   ```

2. Run Installer

   ```bash
   python scripts/setup.py
   ```

   Note: This script fetches the `hand_landmarker.task` AI model (~7 MB) from Google's servers (see the sketch after this list).

3. Launch

   ```bash
   air-gesture --enable-camera
   ```
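For the curious, the model fetch is a one-off download you can also do by hand. The snippet below is only a guess at what `scripts/setup.py` does; the URL is Google's publicly documented MediaPipe hand landmarker model and may not match the installer's exact source or destination path.

```python
# Hypothetical sketch of the model-fetch step performed by scripts/setup.py.
# The URL is Google's published hand landmarker model; the real installer may
# use a different source or save location.
import pathlib
import urllib.request

MODEL_URL = ("https://storage.googleapis.com/mediapipe-models/hand_landmarker/"
             "hand_landmarker/float16/latest/hand_landmarker.task")
MODEL_PATH = pathlib.Path("hand_landmarker.task")

if not MODEL_PATH.exists():
    urllib.request.urlretrieve(MODEL_URL, str(MODEL_PATH))  # ~7 MB download
```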
- X11 (`.xinitrc`): Add `air-gesture &` to your file to start it with X11.
- Wayland/DEs: Add `air-gesture` to your Startup Applications list.
- Light it up: The tracker needs to see your hand clearly. Avoid dark rooms.
- Palm check: Keep your palm facing the camera. It helps the AI anchor your finger position.
- Clean view: Make sure no other apps are hogging your `/dev/video0`.
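If tracking silently produces nothing, a quick throwaway check with OpenCV tells you whether the camera can still be opened at all (this is a diagnostic, not part of Air Gesture):

```python
# Throwaway diagnostic: try to open the default camera (usually /dev/video0).
import cv2

cap = cv2.VideoCapture(0)
print("camera available" if cap.isOpened() else "camera busy or missing")
cap.release()
```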
- Engine: Google MediaPipe.
- Model: Pre-trained model usage is subject to Google's Terms of Service.
made by annyman
