Trained YOLO11n-pose model on hand keypoints
Updated Nov 14, 2024 - Python
An augmented reality application that renders artificial lights on the fingertips of detected hands.
Grip is a functional robotic hand controlled by muscle activity and/or neural signals. Using non-invasive sensors (such as surface EMG), the system translates user intent into precise hand movements. The project combines mechanical design, embedded systems, and AI-driven gesture recognition for intuitive, natural control.