The included code combines two subsystems. The first is a smart glove that translates Arabic Sign Language into spoken Arabic. The second is a 3D-printed human-like arm that renders recognized Arabic speech as Arabic Sign Language finger positions using servo motors. The glove is a sensor-based system built from 5 flex sensors, an accelerometer, and 2 touch sensors. The attached code covers data collection, CNN training, prediction, and the complete glove program. The results, weights, and trained model for the CNN, which reaches 98.28% accuracy, are also attached, along with the dataset used.
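As a minimal sketch of how one glove reading could be packed into an input vector for the CNN, assuming the 5 flex sensors report raw ADC counts, the accelerometer reports three axes in g, and the 2 touch sensors are binary (the ranges, function names, and channel layout here are illustrative assumptions, not the project's actual calibration):

```python
import numpy as np

# Assumed sensor layout: 5 flex sensors, 3 accelerometer axes, 2 touch sensors.
N_FLEX, N_ACCEL, N_TOUCH = 5, 3, 2
N_FEATURES = N_FLEX + N_ACCEL + N_TOUCH  # 10 values per sample

def build_sample(flex, accel, touch,
                 flex_range=(0, 1023), accel_range=(-2.0, 2.0)):
    """Pack one glove reading into a normalized feature vector in [0, 1].

    `flex` holds raw ADC counts, `accel` is in g, `touch` is binary.
    The ranges are illustrative defaults, not the glove's calibration.
    """
    flex = (np.asarray(flex, float) - flex_range[0]) / (flex_range[1] - flex_range[0])
    accel = (np.asarray(accel, float) - accel_range[0]) / (accel_range[1] - accel_range[0])
    touch = np.asarray(touch, float)
    x = np.concatenate([flex, accel, touch])
    # A 1-D CNN typically expects a trailing channel axis: (features, channels).
    return x.reshape(N_FEATURES, 1)

sample = build_sample([512, 300, 700, 650, 480], [0.1, -0.3, 0.98], [1, 0])
print(sample.shape)  # (10, 1)
```

In this sketch the normalized vector would be fed to the trained model (e.g. `model.predict(sample[np.newaxis])` in Keras) to obtain the recognized sign.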
For the arm, the code for both the Raspberry Pi and the Arduino, which together recognize Arabic speech and drive the servo motors, is attached.
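A setup like this typically has the Raspberry Pi send a small framed command over serial, which the Arduino parses and maps to stored servo angles. The framing below (`<NN>` with a two-digit letter index) and the function names are assumptions for illustration, not the project's actual protocol:

```python
# Hypothetical framed serial command from the Raspberry Pi to the Arduino.
# The letter index selects a pre-stored set of servo angles on the Arduino.
START, END = b'<', b'>'

def encode_command(letter_index: int) -> bytes:
    """Frame a recognized-letter index as b'<NN>' for transmission."""
    if not 0 <= letter_index <= 99:
        raise ValueError("letter index out of range")
    return START + f"{letter_index:02d}".encode() + END

def decode_command(packet: bytes) -> int:
    """Inverse of encode_command; what the Arduino-side parser would do."""
    if packet[:1] != START or packet[-1:] != END:
        raise ValueError("bad frame")
    return int(packet[1:-1])

pkt = encode_command(7)
print(pkt)                  # b'<07>'
print(decode_command(pkt))  # 7
```

On the Pi side the packet would be written with a library such as pyserial (`serial.Serial("/dev/ttyUSB0", 9600).write(pkt)`); explicit start/end markers make the stream easy to resynchronize after dropped bytes.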