
✋ Sign Language Recognition Using MNIST Hand Gesture Dataset 🤟📊

Sign-Language-Recognition-Using-MNIST-Hand-Gesture-Dataset is a computer vision and deep learning project that recognizes American Sign Language (ASL) hand gestures using the MNIST Hand Gesture Dataset. The project demonstrates how Convolutional Neural Networks (CNNs) can be applied to empower communication accessibility for the deaf and hard-of-hearing community.

✨ Key Features

✋ Gesture Recognition: Classify hand signs into ASL alphabet letters (A–Z, excluding J and Z, which require motion)

🖼️ Dataset: Trained on the MNIST Hand Gesture Dataset (image-based labeled hand signs)

🧹 Preprocessing: Grayscale normalization, resizing, and data augmentation for robustness

🧠 Deep CNN Models: Custom CNNs and pretrained models (VGG16, ResNet, MobileNet)

📊 Performance Metrics: Accuracy, Precision, Recall, F1-score, Confusion Matrix

📈 Visualization: Training/validation curves, heatmaps, and Grad-CAM to show model focus

🌐 Real-Time Demo: Deployable web app for live gesture recognition (via webcam)

🧰 Tech Stack

Programming: Python 🐍

Deep Learning: TensorFlow / Keras or PyTorch

Libraries: NumPy, Pandas, OpenCV, Matplotlib, Seaborn, Scikit-learn

Deployment (Optional): Flask, Streamlit, FastAPI

๐Ÿ“ Project Structure ๐Ÿ“ dataset/ # MNIST Hand Gesture Dataset ๐Ÿ“ preprocessing/ # Data cleaning & augmentation scripts ๐Ÿ“ models/ # CNN and pretrained architectures ๐Ÿ“ notebooks/ # Jupyter notebooks for training & evaluation ๐Ÿ“ results/ # Metrics, plots & Grad-CAM visualizations ๐Ÿ“ app/ # Web app for real-time sign recognition

🚀 Getting Started

```bash
git clone https://github.com/yourusername/Sign-Language-Recognition-Using-MNIST-Hand-Gesture-Dataset.git
cd Sign-Language-Recognition-Using-MNIST-Hand-Gesture-Dataset
pip install -r requirements.txt
jupyter notebook
```
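As a concrete example of the evaluation metrics listed under Key Features, the sketch below computes accuracy and macro-averaged precision, recall, and F1 from a confusion matrix with plain NumPy. The labels and predictions are made up for illustration; in the project's stack, Scikit-learn's `confusion_matrix` and `classification_report` serve the same purpose:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows = true class, columns = predicted class."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def summarize(cm):
    """Accuracy plus macro-averaged precision, recall, and F1."""
    tp = np.diag(cm).astype(float)
    precision = tp / np.maximum(cm.sum(axis=0), 1)  # per predicted class
    recall = tp / np.maximum(cm.sum(axis=1), 1)     # per true class
    denom = np.maximum(precision + recall, 1e-12)   # guard div-by-zero
    f1 = 2 * precision * recall / denom
    return {
        "accuracy": tp.sum() / cm.sum(),
        "macro_precision": precision.mean(),
        "macro_recall": recall.mean(),
        "macro_f1": f1.mean(),
    }

# Hypothetical labels for a 3-class toy example
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
print(summarize(confusion_matrix(y_true, y_pred, n_classes=3)))
```

Macro averaging weights all 24 letter classes equally, which is the usual choice here since the dataset's classes are roughly balanced.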

📌 Use Cases

🧏 Accessibility: Helps bridge communication gaps for the deaf community

📱 Applications: Integration into mobile apps for real-time sign recognition

๐Ÿซ Education: Assisting learners in practicing ASL alphabets

🤖 Research: Benchmark for applying CNNs in gesture and pattern recognition

๐Ÿค Contributing

Contributions are welcome! You can add more architectures, improve dataset preprocessing, or expand to full ASL word recognition.

📜 License

MIT License – Free to use for research, learning, and open-source development.

โญ Support

If you find this project valuable, please give it a star โญ to support open-source work in AI for accessibility.
