This project focuses on recognizing Indian Sign Language (ISL) gestures from video input. It uses computer vision to extract hand landmarks from video frames and a Long Short-Term Memory (LSTM) model to classify the resulting landmark sequences. The system is designed to aid communication for individuals with hearing and speech impairments.
- Real-time Video Processing: Captures video input to recognize ISL gestures.
- Landmark Extraction: Uses MediaPipe or OpenPose to extract hand landmarks.
- Gesture Recognition: LSTM-based model for sequence classification.
- Scalability: Supports training with an extended set of ISL gestures.
- User-Friendly Interface: Can be integrated into applications for accessibility solutions.
- Video Capture: The system records a video of hand movements.
- Landmark Detection: Extracts hand landmarks (X, Y, Z coordinates) from each frame (see the sketch after this list).
- Feature Processing: Normalizes and processes the extracted coordinates.
- Sequence Classification: Uses an LSTM model to classify gestures.
- Prediction Output: Displays or converts recognized gestures into text or speech.
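A minimal sketch of the landmark-extraction step, assuming MediaPipe Hands (21 landmarks per hand, each with normalized x, y, z coordinates). The function name `extract_landmark_sequence` and the zero-padding for frames with no detected hand are illustrative choices, not necessarily how this repository implements it:

```python
import cv2
import numpy as np
import mediapipe as mp

mp_hands = mp.solutions.hands

def extract_landmark_sequence(video_path, max_hands=1):
    """Read a video and return an array of shape (num_frames, 21 * 3)
    holding the (x, y, z) coordinates of the detected hand landmarks."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    with mp_hands.Hands(static_image_mode=False,
                        max_num_hands=max_hands,
                        min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV reads frames in BGR.
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                hand = result.multi_hand_landmarks[0]
                coords = [(lm.x, lm.y, lm.z) for lm in hand.landmark]
            else:
                # No hand detected: zero-pad so the sequence length
                # still matches the number of frames in the clip.
                coords = [(0.0, 0.0, 0.0)] * 21
            frames.append(np.array(coords).flatten())
    cap.release()
    return np.array(frames)
```

Each clip then becomes a `(num_frames, 63)` array that the sequence model can consume.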
- Python (OpenCV, NumPy, Pandas)
- MediaPipe (for hand landmark detection)
- TensorFlow/Keras (for LSTM model training and prediction)
- Matplotlib & Seaborn (for data visualization)
- Clone the repository:
git clone https://github.com/UnbeatableBann/Indian Sign Language.git
cd Indian Sign Language
- Install dependencies:
pip install -r requirements.txt
- Run the application:
python main.py
- The model is trained on a dataset of ISL gestures.
- Each gesture video is converted into a sequence of landmark coordinates (see the preprocessing sketch below).
- The dataset includes common ISL gestures like greetings, alphabets, and numbers.
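As a rough illustration of the preprocessing step, the sketch below turns a folder-per-gesture dataset into fixed-length NumPy arrays. The `data/<gesture>/<clip>.mp4` layout, the `MAX_FRAMES` value, and the output file names are assumptions, and it reuses the `extract_landmark_sequence` helper sketched above:

```python
import os
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

DATA_DIR = "data"    # assumed layout: data/<gesture_name>/<clip>.mp4
MAX_FRAMES = 60      # pad or truncate every clip to a fixed length

sequences, labels = [], []
gestures = sorted(os.listdir(DATA_DIR))

for label_idx, gesture in enumerate(gestures):
    gesture_dir = os.path.join(DATA_DIR, gesture)
    for clip in os.listdir(gesture_dir):
        # extract_landmark_sequence is the helper sketched in "How It Works".
        seq = extract_landmark_sequence(os.path.join(gesture_dir, clip))
        if len(seq) == 0:
            continue  # skip clips where no frames could be read
        sequences.append(seq)
        labels.append(label_idx)

# Pad/truncate to a uniform length so the LSTM sees fixed-size input.
X = pad_sequences(sequences, maxlen=MAX_FRAMES, dtype="float32",
                  padding="post", truncating="post")
y = np.array(labels)

np.save("X.npy", X)
np.save("y.npy", y)
```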
- Preprocess Data: Extract landmarks from videos and store them in CSV or NumPy arrays.
- Train LSTM Model (see the model sketch after this list):
python train.py
- Evaluate Model: Assess accuracy using test datasets.
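For reference, a minimal Keras LSTM classifier over the padded landmark sequences could look like the sketch below. The layer sizes, split ratio, and file names are illustrative assumptions, not necessarily what `train.py` uses:

```python
import numpy as np
import tensorflow as tf

# Load the preprocessed landmark sequences (see the preprocessing sketch above).
X = np.load("X.npy")    # shape: (num_clips, MAX_FRAMES, 63)
y = np.load("y.npy")    # integer gesture labels
num_classes = len(np.unique(y))

# Shuffle and split into train/test sets (80/20).
rng = np.random.default_rng(42)
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
X_train, y_train = X[idx[:split]], y[idx[:split]]
X_test, y_test = X[idx[split:]], y[idx[split:]]

model = tf.keras.Sequential([
    # Masking lets the LSTM ignore zero-padded frames.
    tf.keras.layers.Masking(mask_value=0.0, input_shape=X.shape[1:]),
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(X_train, y_train, validation_split=0.1, epochs=50, batch_size=16)

# Evaluate on the held-out test set and save the trained model.
loss, acc = model.evaluate(X_test, y_test)
print(f"Test accuracy: {acc:.3f}")
model.save("isl_lstm.h5")
```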
- Expand gesture vocabulary for better recognition.
- Implement a real-time translation system.
- Improve accuracy with attention-based neural networks.
- Develop a mobile application for on-the-go recognition.
- UnbeatableBann (@UnbeatableBann)
This project is licensed under the MIT License.
- Inspired by research on sign language recognition.
- Thanks to the open-source community for resources and tools.