
Sign-Language Detection App

App to read the American Sign Language alphabet. For more information, read the blog post.


Usage

To use the app, download or clone the repository, install the dependencies with `pip install -r requirements.txt`, and then run `python app.py`. The model works best against a clear background, and the hand must be inside the box.
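If you want to run a single prediction outside the app, a hypothetical sketch is below. The model file name, box coordinates, and preprocessing are assumptions for illustration, not the actual code in `app.py`.

```python
import string
import cv2
import torch

# Assumed artefacts: a saved model "model.pt" and a fixed capture box.
model = torch.load("model.pt", map_location="cpu")
model.eval()
letters = string.ascii_uppercase  # 26 ASL letters

cap = cv2.VideoCapture(0)
ret, frame = cap.read()
cap.release()

crop = frame[100:356, 100:356]                        # keep the hand inside this box
crop = cv2.cvtColor(crop, cv2.COLOR_BGR2RGB)
x = torch.from_numpy(crop).permute(2, 0, 1).float() / 255.0

with torch.no_grad():
    pred = model(x.unsqueeze(0)).argmax(dim=1).item()
print(letters[pred])
```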

American Sign Language alphabet:


Training

Use the train.ipynb notebook to train a model from scratch.
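For a sense of the training procedure outside the notebook, here is a minimal sketch. The optimiser, learning rate and epoch count are assumptions and may differ from the notebook's settings; `model` and `train_loader` would be built as in the dataset and model sketches further down.

```python
import torch
import torch.nn as nn

def train(model, train_loader, epochs=10, lr=1e-3, device="cpu"):
    """Minimal supervised training loop for the 26-class classifier."""
    model = model.to(device)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    for epoch in range(epochs):
        model.train()
        running_loss = 0.0
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
            running_loss += loss.item()
        print(f"epoch {epoch}: mean loss {running_loss / len(train_loader):.4f}")
```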

A similar training set of 28x28 grayscale images is available on Kaggle and was used initially for training. However, the lack of variation within that dataset meant that the model didn't generalise well to different backgrounds and situations.

I created a new dataset of 256x256 colour images containing 3279 training images and 1069 test images. The images contain a variety of backgrounds, lighting, hand positions and angles. The dataset can be downloaded here.
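Loading the images for training might look like the sketch below. The directory layout (one sub-folder per letter under `data/train` and `data/test`) and the batch size are assumptions; the actual layout is described with the dataset download.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Resize to the 256x256 colour resolution described above.
transform = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("data/train", transform=transform)
test_set = datasets.ImageFolder("data/test", transform=transform)

train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
test_loader = DataLoader(test_set, batch_size=32)

images, labels = next(iter(train_loader))
print(images.shape)  # torch.Size([32, 3, 256, 256])
```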

The model is a 12-block ConvNet built using PyTorch. Each block consists of a 3x3 2D convolution layer, a Tanh activation layer and a batch-norm layer. The first convolution is 5x5. The channel dimensions are: 6, 16, 120, 120, 120, 120, 120, 120, 120, 120, 64, 26. For more details, see the source code.
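A sketch of that architecture is below. The block structure and channel progression follow the description above; the stride-2 downsampling in the early blocks and the average-pool head are assumptions, so check the source code for the exact layout.

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """One block: 2D convolution -> Tanh -> batch norm."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              stride=stride, padding=kernel_size // 2)
        self.act = nn.Tanh()
        self.norm = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        return self.norm(self.act(self.conv(x)))

class SignNet(nn.Module):
    """12-block ConvNet: 5x5 first convolution, 3x3 thereafter, with the
    channel progression from the README. Downsampling strategy and the
    pooling head are assumptions."""
    def __init__(self, num_classes=26):
        super().__init__()
        channels = [3, 6, 16, 120, 120, 120, 120, 120, 120, 120, 120, 64, num_classes]
        blocks = [
            ConvBlock(c_in, c_out,
                      kernel_size=5 if i == 0 else 3,
                      stride=2 if i < 6 else 1)
            for i, (c_in, c_out) in enumerate(zip(channels[:-1], channels[1:]))
        ]
        self.blocks = nn.Sequential(*blocks)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x):
        x = self.blocks(x)               # (N, 26, 4, 4) for a 256x256 input
        return self.pool(x).flatten(1)   # (N, 26) class scores

model = SignNet()
print(model(torch.randn(1, 3, 256, 256)).shape)  # torch.Size([1, 26])
```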
