We have tried to develop a proof of concept for the following:
- Virtual Doodle
- Self Checkout
- Classroom
Virtual Doodle
The user moves their hand just like a magician's wand to transform their imagination into real artwork. The doodle can also be saved in the doodle gallery afterward.
Classroom
This tool offers touch-free navigation for teachers using common desktops to deliver lectures in the classroom.
Grocery Self Checkout
The self-checkout provides touch-free navigation and a form-filling option using speech and gesture recognition.
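Mediapipe reports hand landmarks as coordinates normalized to [0, 1], so a doodle app like the one above has to scale the index fingertip to canvas pixels and smooth out hand jitter before drawing. A minimal sketch of that mapping, assuming exponential smoothing (the function names and the smoothing factor are illustrative, not taken from this project):

```python
# Illustrative fingertip-to-canvas mapping; not the project's actual code.

def to_pixel(norm_x: float, norm_y: float, width: int, height: int) -> tuple:
    """Scale a normalized Mediapipe landmark to integer pixel coordinates."""
    return int(norm_x * width), int(norm_y * height)

def smooth(prev: tuple, cur: tuple, alpha: float = 0.3) -> tuple:
    """Exponential smoothing: move only a fraction `alpha` toward the new point,
    which keeps the drawn stroke from wobbling with every small hand tremor."""
    return (prev[0] + alpha * (cur[0] - prev[0]),
            prev[1] + alpha * (cur[1] - prev[1]))

# In the real app, the current point would come from Mediapipe's hand results,
# e.g. results.multi_hand_landmarks[0].landmark[8] (the index fingertip),
# and consecutive smoothed points would be joined with cv2.line on the canvas.
```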
- Python
- OpenCV
- Mediapipe
- Django
- HTML/CSS/Bootstrap
- AutoPy
- SpeechRecognition
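For touch-free navigation, the fingertip position detected in the camera frame has to be mapped to screen coordinates before a library like AutoPy can move the cursor. One common approach, sketched below under assumed values (the `margin` and function name are illustrative, not from this project), is to map an inner region of the camera frame to the full screen so the fingertip can reach the screen edges without leaving the frame:

```python
# Illustrative camera-to-screen mapping for touch-free cursor control.

def map_to_screen(x, y, cam_w, cam_h, screen_w, screen_h, margin=100):
    """Map a fingertip position in the camera frame to screen coordinates,
    using an inner active region of `margin` pixels on each side."""
    # Clamp into the active region [margin, cam_w - margin] x [margin, cam_h - margin].
    x = min(max(x, margin), cam_w - margin)
    y = min(max(y, margin), cam_h - margin)
    # Linearly rescale the active region to the full screen.
    sx = (x - margin) / (cam_w - 2 * margin) * screen_w
    sy = (y - margin) / (cam_h - 2 * margin) * screen_h
    return sx, sy

# The resulting (sx, sy) would then be passed to autopy.mouse.move(sx, sy).
```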
Install the main dependencies:

Django
$ pip install django

SpeechRecognition
$ pip install SpeechRecognition

MediaPipe
$ pip install mediapipe

PyAudio
$ pip install PyAudio

AutoPy
$ pip install autopy

OpenCV
$ pip install opencv-python

The first thing to do is to clone the repository:

$ git clone https://github.com/S-JZ/TouchMeNot.git

Create a virtual environment to install the dependencies in, and activate it:

$ virtualenv env
$ source env/bin/activate

Then install the dependencies:

(env)$ pip install -r requirements.txt

Note the (env) in front of the prompt. This indicates that this terminal session operates in a virtual environment set up by virtualenv.

Once pip has finished downloading the dependencies:

(env)$ cd TouchMeNot
(env)$ python manage.py runserver

And navigate to http://127.0.0.1:8000/.
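As an illustration of the speech-driven form filling mentioned for the self-checkout: once the SpeechRecognition package has transcribed an utterance, the text still has to be parsed into a form field and a value. A minimal sketch, assuming a simple "enter <field> <value>" command grammar (the grammar and function name are assumptions, not the project's actual design):

```python
# Illustrative parser for spoken form-filling commands.
import re

def parse_command(text):
    """Parse commands like 'enter name John Doe' into a (field, value) pair,
    or return None when the utterance is not a form-filling command."""
    match = re.match(r"enter (\w+) (.+)", text.strip().lower())
    if match:
        return match.group(1), match.group(2)
    return None

# In the real app, `text` would come from the SpeechRecognition package,
# e.g. recognizer.recognize_google(audio), and the parsed pair would be
# used to populate the corresponding field of the checkout form.
```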
