A Master of Science thesis in Computer Engineering by Noor Ali Tubaiz entitled "Sensor-based Continuous Arabic Sign Language Recognition," submitted in June 2014. The thesis advisor is Dr. Tamer Shanableh and the thesis co-advisor is Dr. Khaled Assaleh. Both soft and hard copies of the thesis are available.
Arabic sign language is the most common means of communication between deaf and hearing individuals in the Arab world. Because knowledge of Arabic sign language is scarce among the hearing population, deaf people tend to be isolated. Most research in this area focuses on isolated gesture recognition using vision-based or sensor-based approaches. While a few recognition systems have been proposed for continuous Arabic sign language using vision-based methods, such systems require complex image processing and feature extraction techniques. Therefore, this thesis proposes an automatic sensor-based continuous Arabic sign language recognition system in an attempt to facilitate this kind of communication. To build the system, we created a dataset of 40 sentences drawn from an 80-word lexicon, which we intend to make publicly available to the research community. In the dataset, hand movements and gestures were captured using two DG5-VHand data gloves. Next, as part of data labeling for supervised learning, a camera setup was used to synchronize hand gestures with their corresponding words. Having compiled the dataset, we applied low-complexity preprocessing and feature extraction techniques to eliminate the natural temporal dependency of the data. Subsequently, the system model was built using a low-complexity modified k-Nearest Neighbor (KNN) approach. The proposed technique achieved a sentence recognition rate of 98%. Finally, the results were compared, in terms of complexity and recognition accuracy, against sequential-data systems that use common, more complex methods such as Nonlinear AutoRegressive eXogenous (NARX) models and Hidden Markov Models (HMMs).
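As a rough illustration of the kind of pipeline the abstract describes (window-based feature extraction over glove sensor streams, followed by a KNN classifier), the sketch below uses plain Python on synthetic readings. The window size, the non-overlapping averaging scheme, the majority vote, and all names are illustrative assumptions; the thesis's actual preprocessing and its "modified" KNN are not specified in this abstract.

```python
import math
from collections import Counter

def window_features(samples, win=5):
    # Average consecutive sensor readings over non-overlapping windows.
    # A stand-in for the low-complexity feature extraction step; the
    # window size and averaging choice are illustrative assumptions.
    feats = []
    for i in range(0, len(samples) - win + 1, win):
        window = samples[i:i + win]
        dim = len(window[0])
        feats.append([sum(s[d] for s in window) / win for d in range(dim)])
    return feats

def knn_predict(train, query, k=3):
    # Plain majority-vote KNN over Euclidean distance. The thesis uses a
    # *modified* KNN; this baseline only shows the general idea.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical usage with two made-up gesture classes:
train = [([0.0, 0.0], "rest"), ([0.1, 0.0], "rest"),
         ([1.0, 1.0], "wave"), ([0.9, 1.1], "wave")]
print(knn_predict(train, [0.95, 1.0]))  # prints "wave"
```

In practice each training pair would be a window-feature vector extracted from the glove stream together with the word label obtained from the camera-synchronized annotation.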