A New 2D Static Hand Gesture Colour Image Dataset for ASL Gestures

It usually takes a fusion of image processing and machine learning algorithms to build a fully functioning computer vision system for hand gesture recognition. Fortunately, the complexity of developing such a system can be alleviated by treating it as a collection of sub-systems that work together but can be dealt with in isolation. Machine learning algorithms need to be fed thousands of exemplars (e.g. images, features) to automatically establish recognisable patterns for all the classes (e.g. hand gestures) that apply to the problem domain. A large number of exemplars helps, but their efficacy also depends on the variability of illumination conditions, hand postures, angles of rotation and scaling, and on the number of volunteers from whom the hand gesture images were taken. These exemplars are usually subjected to image processing first, to reduce the presence of noise and to extract the important features from the images. These features serve as inputs to the machine learning system, and the different sub-systems are then integrated to form a complete computer vision system for gesture recognition. The main contribution of this work is the production of the exemplars. We discuss how a dataset of standard American Sign Language (ASL) hand gestures, containing 2425 images from 5 individuals with variations in lighting conditions and hand postures, is generated with the aid of image processing techniques. A minor contribution is a specific feature extraction method based on moment invariants, for which the computation method and the resulting values are furnished with the dataset.
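As an illustration of the kind of feature vector referred to above, the minimal sketch below computes the seven Hu moment invariants of a binarised hand silhouette using OpenCV. This is only an assumed pipeline for illustration: Otsu thresholding, the log scaling, and the file name are all hypothetical choices, and the computation method actually used for the published dataset is the one furnished with it.

import cv2
import numpy as np

def hu_moment_features(image_path: str) -> np.ndarray:
    """Return log-scaled Hu moment invariants for a single hand image (illustrative only)."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)

    # Convert to greyscale and binarise so that the moments describe the
    # hand silhouette rather than the background (assumed preprocessing).
    grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(grey, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Raw spatial and central moments of the silhouette; cv2.HuMoments
    # combines them into seven rotation-, scale- and translation-invariant values.
    moments = cv2.moments(binary)
    hu = cv2.HuMoments(moments).flatten()

    # Log-scale the invariants so their magnitudes are comparable across orders.
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# Hypothetical usage:
# features = hu_moment_features("asl_a_subject1.png")

In practice such a feature vector would be computed for every image in the dataset and passed to the chosen classifier; the log scaling shown here is a common convention for making the widely differing magnitudes of the seven invariants easier to compare, not a requirement of the dataset itself.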
