Hand gesture recognition using Kinect

Hand gesture recognition (HGR) is an important research topic because some situations require silent communication through sign languages. Computational HGR systems assist silent communication and help people learn a sign language. In this article, a novel method for contactless HGR using the Microsoft Kinect for Xbox is described, and a real-time HGR system is implemented. The system detects the presence of gestures, identifies fingers, and recognizes the meanings of nine gestures in a pre-defined Popular Gesture scenario. The accuracy of the HGR system ranges from 84% to 99% for single-hand gestures, and from 90% to 100% when both hands perform the same gesture at the same time. Because the Kinect depth sensor is an infrared camera, lighting conditions, signers' skin colors and clothing, and the background have little impact on the performance of the system. This accuracy and robustness make the system a versatile component that can be integrated into a variety of everyday applications.
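A common way to identify fingers from a depth-segmented hand, as in systems of this kind, is to compute the convex hull of the hand contour and treat hull vertices far from the palm center as fingertip candidates. The sketch below illustrates the idea in plain Python; the contour points, palm center, and distance threshold are illustrative assumptions, not the paper's actual pipeline, which operates on Kinect depth frames.

```python
# Sketch: fingertip candidates from a segmented hand contour.
# Assumes the hand silhouette has already been extracted from the
# Kinect depth map; the inputs below are illustrative.

def convex_hull(points):
    """Andrew's monotone chain convex hull (counter-clockwise order)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); positive for a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Drop the last point of each half; it repeats the other half's start.
    return lower[:-1] + upper[:-1]

def fingertip_candidates(contour, palm_center, min_dist):
    """Hull vertices farther than min_dist pixels from the palm center."""
    cx, cy = palm_center
    return [p for p in convex_hull(contour)
            if ((p[0] - cx) ** 2 + (p[1] - cy) ** 2) ** 0.5 >= min_dist]
```

In practice the candidate points would be filtered further (e.g., by convexity-defect depth or inter-finger angle) before being counted as fingers; the threshold `min_dist` here simply separates protruding hull vertices from points near the palm.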
