Arabic Sign Language Recognition Using Neural Networks

Document Type : Original Article

Abstract

Sign language maps the letters, words, and expressions of a given language to a set of hand
gestures, enabling an individual to communicate with the hands rather than by speaking.
Systems capable of recognizing sign-language symbols can serve as a means of
communication between hearing-impaired and hearing people. This paper presents an
attempt to recognize two-handed signs from the Unified Arabic Sign Language Dictionary
using a webcam and artificial neural networks. Hu moments are used for feature extraction.
200 samples of each of 5 two-handed signs were collected from an adult signer. 150 samples
of each sign were used to train an artificial neural network to perform the recognition,
and performance was measured by testing the trained system on the remaining 50 samples of
each sign. A recognition rate of 87.6% was obtained on the testing data. When more signs
are considered, the artificial neural network must be retrained so that the new signs are
recognized and categorized.
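The feature-extraction step mentioned above can be illustrated with a minimal sketch. The following is a pure-Python implementation of the seven Hu invariant moments, computed from a binary silhouette image; it assumes the hand has already been segmented into a 0/1 mask (the paper's capture and segmentation pipeline is not shown), and it is an illustration of the standard Hu-moment formulas, not the authors' implementation.

```python
def hu_moments(img):
    """Compute the seven Hu invariant moments of a binary image.

    img: 2D list (rows) of 0/1 pixel values, assumed to be a
    pre-segmented hand silhouette. Returns a list of 7 floats.
    """
    h, w = len(img), len(img[0])

    # Raw moments: M_pq = sum over pixels of x^p * y^q * I(x, y)
    def raw(p, q):
        return sum(x**p * y**q * img[y][x]
                   for y in range(h) for x in range(w))

    m00 = raw(0, 0)                       # area of the silhouette
    cx, cy = raw(1, 0) / m00, raw(0, 1) / m00  # centroid

    # Central moments mu_pq: translation-invariant
    def mu(p, q):
        return sum((x - cx)**p * (y - cy)**q * img[y][x]
                   for y in range(h) for x in range(w))

    # Scale-normalised moments eta_pq = mu_pq / mu_00^(1 + (p+q)/2)
    def eta(p, q):
        return mu(p, q) / m00**(1 + (p + q) / 2)

    e20, e02, e11 = eta(2, 0), eta(0, 2), eta(1, 1)
    e30, e03, e21, e12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)

    # The seven Hu invariants (invariant to translation, scale, rotation)
    h1 = e20 + e02
    h2 = (e20 - e02)**2 + 4 * e11**2
    h3 = (e30 - 3 * e12)**2 + (3 * e21 - e03)**2
    h4 = (e30 + e12)**2 + (e21 + e03)**2
    h5 = ((e30 - 3 * e12) * (e30 + e12)
          * ((e30 + e12)**2 - 3 * (e21 + e03)**2)
          + (3 * e21 - e03) * (e21 + e03)
          * (3 * (e30 + e12)**2 - (e21 + e03)**2))
    h6 = ((e20 - e02) * ((e30 + e12)**2 - (e21 + e03)**2)
          + 4 * e11 * (e30 + e12) * (e21 + e03))
    h7 = ((3 * e21 - e03) * (e30 + e12)
          * ((e30 + e12)**2 - 3 * (e21 + e03)**2)
          - (e30 - 3 * e12) * (e21 + e03)
          * (3 * (e30 + e12)**2 - (e21 + e03)**2))
    return [h1, h2, h3, h4, h5, h6, h7]
```

Because the central moments subtract the centroid, the resulting seven-value feature vector is unchanged when the hand appears at a different position in the frame, which is what makes it a convenient fixed-length input for a neural-network classifier.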