Sign language maps the letters, words, and expressions of a spoken language to a set of hand gestures, enabling an individual to communicate with the hands rather than by speaking. Systems capable of recognizing sign-language symbols can serve as a means of communication between hearing-impaired and hearing people. This paper presents an attempt to recognize two-handed signs from the Unified Arabic Sign Language Dictionary using a webcam and an artificial neural network. Hu moments are used for feature extraction. 200 samples of each of 5 two-handed signs were collected from an adult signer. 150 samples of each sign were used to train an artificial neural network to perform the recognition. Performance was measured by testing the trained system on the remaining 50 samples of each sign, yielding a recognition rate of 87.6% on the testing data. When more signs are considered, the artificial neural network must be retrained so that the new signs can be recognized and categorized.
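The pipeline described in the abstract (Hu moment features followed by a neural-network classifier with a 150/50 train/test split per sign) can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes OpenCV for the Hu moments, scikit-learn's MLPClassifier as a stand-in for the neural network, and a hypothetical `frames` list of labeled grayscale webcam images.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def hu_features(gray_image):
    # Segment the hand region and compute the 7 Hu moment invariants.
    # A log transform is a common way to compress their dynamic range.
    _, binary = cv2.threshold(gray_image, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    moments = cv2.moments(binary)
    hu = cv2.HuMoments(moments).flatten()
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

def train_recognizer(frames):
    # `frames` is a hypothetical list of (grayscale_image, sign_label) pairs,
    # e.g. 200 webcam captures for each of the 5 two-handed signs.
    X = np.array([hu_features(img) for img, _ in frames])
    y = np.array([label for _, label in frames])
    # Hold out 25% per sign (50 of 200), mirroring the paper's 150/50 split.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(30,), max_iter=2000)
    net.fit(X_train, y_train)
    print("test accuracy:", net.score(X_test, y_test))
    return net
```

The network size, thresholding method, and library choices here are assumptions for illustration; the paper itself only specifies Hu moments as features and an artificial neural network as the classifier.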