In this project, we develop a computing system that can understand human gestures. Specifically, we attempt to recognise ASL signs for the following words:
- ABOUT https://www.signingsavvy.com/sign/ABOUT/32/1
- AND https://www.signingsavvy.com/sign/AND/888/1
- CAN https://www.signingsavvy.com/sign/CAN/524/1
- COP https://www.signingsavvy.com/sign/COP/3203/1
- DEAF https://www.signingsavvy.com/sign/DEAF/102/1
- DECIDE https://www.signingsavvy.com/sign/DECIDE/781/1
- FATHER https://www.signingsavvy.com/sign/FATHER/3440/1
- FIND https://www.signingsavvy.com/sign/FIND/146/1
- GO OUT https://www.signingsavvy.com/sign/GO+OUT
- HEARING https://www.signingsavvy.com/sign/HEARING/3594/1
The project is organised into four phases:
- Phase 1: Data collection
- Phase 2: Feature Extraction and PCA (see the sketch below)
- Phase 3: Training SVM, Decision Tree, and Neural Network classifiers; User-Dependent Analysis (see the sketch below)
- Phase 4: User-Independent Analysis (see the evaluation sketch at the end)
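
As a rough illustration of Phases 2 and 3, the sketch below shows how PCA-based feature extraction and classifier training might be wired up with scikit-learn. The file name `features.csv`, the column layout, and every hyperparameter are assumptions made for illustration only, not the project's actual configuration.

```python
# Minimal sketch of Phases 2-3: PCA feature extraction followed by
# SVM / decision-tree training. File name, column layout and
# hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Hypothetical feature matrix: one row per gesture sample, last column = sign label.
data = np.loadtxt("features.csv", delimiter=",")
X, y = data[:, :-1], data[:, -1].astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Phase 2: standardise the features, then project onto principal components
# that retain (say) 95% of the variance.
# Phase 3: train two of the classifiers mentioned above on the projected features.
svm_clf = make_pipeline(StandardScaler(), PCA(n_components=0.95),
                        SVC(kernel="rbf", C=1.0))
tree_clf = make_pipeline(StandardScaler(), PCA(n_components=0.95),
                         DecisionTreeClassifier(max_depth=10))

for name, clf in [("SVM", svm_clf), ("Decision tree", tree_clf)]:
    clf.fit(X_train, y_train)
    print(f"{name} accuracy: {clf.score(X_test, y_test):.3f}")
```

This user-dependent split mixes samples from all signers in both the training and test sets; the user-independent protocol of Phase 4 is sketched at the end of this README.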
For more details, please see the specification provided in each directory.
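
For Phase 4, a user-independent analysis is commonly run by holding out every sample from one signer, training on the remaining signers, and repeating for each signer. The sketch below assumes hypothetical `X`, `y`, and `subjects` arrays (features, sign labels, and a signer ID per sample); it is one possible protocol, not a description of the project's actual code.

```python
# Sketch of a leave-one-subject-out protocol for Phase 4 (user-independent analysis).
# X: feature matrix, y: sign labels, subjects: signer ID for each sample (all assumed).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def user_independent_accuracy(X, y, subjects):
    """Train on all but one signer, test on the held-out signer, and average."""
    logo = LeaveOneGroupOut()
    scores = []
    for train_idx, test_idx in logo.split(X, y, groups=subjects):
        clf = make_pipeline(StandardScaler(), PCA(n_components=0.95),
                            SVC(kernel="rbf"))
        clf.fit(X[train_idx], y[train_idx])
        scores.append(clf.score(X[test_idx], y[test_idx]))
    return float(np.mean(scores))
```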