We help deaf and mute people communicate with hearing people through hand-gesture-to-speech conversion. This code uses depth maps from a Kinect camera together with techniques such as convex hull and contour mapping to recognise five hand signs.
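A minimal sketch (not the original code) of the convex hull + contour approach: segment the hand from a depth map by thresholding a depth band, find the largest contour, and count convexity defects as gaps between fingers. The depth range and defect threshold below are illustrative assumptions.

```python
import cv2
import numpy as np

def count_fingers(depth_map, near_mm=500, far_mm=900):
    # Keep only pixels whose depth falls in the assumed hand range
    mask = cv2.inRange(depth_map, near_mm, far_mm)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)            # largest blob = hand

    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)           # valleys between fingertips
    if defects is None:
        return 0

    gaps = 0
    for start, end, far, depth in defects[:, 0]:
        # Deep defects correspond to the gaps between extended fingers;
        # depth is in fixed-point 1/256-pixel units, threshold is a guess
        if depth / 256.0 > 20:
            gaps += 1
    return gaps + 1 if gaps else 0                        # n gaps -> n+1 fingers
```

The finger count (plus simple contour features) can then be mapped to one of the five recognised signs.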
A real-time hand sign language detection system built with OpenCV, Flask, and deep learning. The system captures a live webcam feed, detects hand gestures using the cvzone HandTrackingModule, and classifies the detected signs with a custom-trained Keras model.
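A rough sketch of the detection/classification loop described above. The model path, input size, crop padding, and label list are placeholders, not the repository's actual values.

```python
import cv2
import numpy as np
from cvzone.HandTrackingModule import HandDetector
from tensorflow.keras.models import load_model

detector = HandDetector(maxHands=1)
model = load_model("model.h5")                     # placeholder model path
labels = ["A", "B", "C"]                           # placeholder label set

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hands, frame = detector.findHands(frame)       # detect hand, draw landmarks
    if hands:
        x, y, w, h = hands[0]["bbox"]
        crop = frame[max(0, y - 20):y + h + 20, max(0, x - 20):x + w + 20]
        if crop.size:
            crop = cv2.resize(crop, (224, 224)) / 255.0   # assumed model input size
            pred = model.predict(crop[np.newaxis], verbose=0)
            cv2.putText(frame, labels[int(np.argmax(pred))], (x, y - 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 0, 255), 2)
    cv2.imshow("Sign detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```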
Hello friends, I am building a machine learning repo where I will upload several datasets and their solutions with explanations, starting from the basics and moving up in difficulty.
This is a demonstration of hand pose recognition implemented using a Flask backend and the Indian Sign Language Translator API. It is now hosted on an AWS instance at http://18.236.194.220:5000/
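The actual API routes are not documented here, but a Flask backend of this kind typically exposes a single prediction endpoint. The sketch below is purely hypothetical: the route name, payload format, model path, and labels are assumptions for illustration only.

```python
import numpy as np
import cv2
from flask import Flask, request, jsonify
from tensorflow.keras.models import load_model

app = Flask(__name__)
model = load_model("isl_model.h5")      # placeholder model path
LABELS = ["hello", "thanks", "yes"]     # placeholder label set

@app.route("/predict", methods=["POST"])    # assumed route name
def predict():
    # Decode the uploaded image, resize to the model's input, and classify
    data = np.frombuffer(request.files["image"].read(), np.uint8)
    img = cv2.imdecode(data, cv2.IMREAD_COLOR)
    img = cv2.resize(img, (224, 224)) / 255.0
    pred = model.predict(img[np.newaxis], verbose=0)
    return jsonify({"label": LABELS[int(np.argmax(pred))]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```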
A deep learning model that accurately classifies hand sign images representing digits 0-9 using convolutional neural networks. Achieves over 95% accuracy on the Sign Language Digits Dataset.
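A minimal sketch of a convolutional network in the spirit described above, assuming 64x64 grayscale inputs; the layer sizes and hyperparameters here are illustrative, not the repository's actual architecture.

```python
from tensorflow.keras import layers, models

def build_digit_sign_cnn(input_shape=(64, 64, 1), num_classes=10):
    # Three conv/pool stages followed by a small dense classifier head
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```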