Sign-Language Recognition Through Gesture & Movement Analysis (SIGMA)

Sign language is a means of communication for the speech-impaired, using hand gestures and movements to convey meaning. Several studies have attempted to recognize hand gestures by computer, using either a data glove, which reads several sensors to determine hand pose and attitude, or a vision-based system, which uses a camera to track hand position and the gestures made. We introduce a system that combines a prototype data glove with computer vision to translate Filipino Sign Language for medical purposes. The system applies the Viterbi algorithm to find the best-fitting gesture under the hidden Markov model (HMM). Three training sets were used to validate the system's recognition accuracy. The first, covering the alphabet and numbers, achieved a recognition accuracy of 71.8%. The second, composed of health-care-related words, produced a recognition rate of 80.6%. The third comprised the full vocabulary of 26 letters of the alphabet, the numbers zero to nine, and 30 words commonly used in health-care work, resulting in an accuracy of 80.55%.
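To make the decoding step concrete, the sketch below shows how the Viterbi algorithm recovers the most likely hidden state sequence from an HMM given a sequence of observations. The states ("rest", "raise", "wave"), quantized sensor readings ("low", "mid", "high"), and all probabilities are illustrative placeholders, not the parameters trained for SIGMA.

```python
import math

# Hypothetical gesture HMM: states, symbols, and probabilities are
# illustrative only, not SIGMA's trained model parameters.
states = ["rest", "raise", "wave"]
start_p = {"rest": 0.6, "raise": 0.3, "wave": 0.1}
trans_p = {  # P(next state | current state)
    "rest":  {"rest": 0.7, "raise": 0.2, "wave": 0.1},
    "raise": {"rest": 0.1, "raise": 0.6, "wave": 0.3},
    "wave":  {"rest": 0.1, "raise": 0.3, "wave": 0.6},
}
emit_p = {  # P(quantized sensor reading | state)
    "rest":  {"low": 0.7, "mid": 0.2, "high": 0.1},
    "raise": {"low": 0.2, "mid": 0.6, "high": 0.2},
    "wave":  {"low": 0.1, "mid": 0.2, "high": 0.7},
}

def viterbi(obs):
    """Return the most likely state sequence for the observation list."""
    # Log-likelihood of the best path ending in each state at t = 0
    delta = {s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
             for s in states}
    back = []  # backpointers: best predecessor of each state at each step
    for o in obs[1:]:
        ptr, new = {}, {}
        for s in states:
            prev = max(states,
                       key=lambda p: delta[p] + math.log(trans_p[p][s]))
            ptr[s] = prev
            new[s] = (delta[prev] + math.log(trans_p[prev][s])
                      + math.log(emit_p[s][o]))
        back.append(ptr)
        delta = new
    # Trace the best final state back through the pointers
    best = max(states, key=delta.get)
    path = [best]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi(["low", "mid", "high", "high"]))
# -> ['rest', 'raise', 'wave', 'wave']
```

In a full recognizer of this kind, one HMM is typically trained per sign and the decoder scores the sensor sequence against each model, selecting the sign whose model yields the highest likelihood.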