Human gesture recognition through a Kinect sensor

Gesture recognition can be applied to many research areas, such as vision-based interfaces, communication, and human-robot interaction (HRI). This paper implements a non-intrusive, real-time gesture recognition system using a depth sensor. The relevant features are obtained from the human skeleton model generated by the Kinect sensor, and Hidden Markov Models (HMMs) are used to model the dynamics of the gestures. We conducted offline experiments to evaluate the accuracy and robustness of the system, and online experiments to verify that it meets the real-time requirement. The results indicate that the average recognition accuracy is around 85% for the subject who provided the training data and 73% for a subject who did not. The system was also used to interact with a mobile robot through gestures; this application shows that it is robust enough to run in real time.
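
The skeleton-feature and per-gesture HMM pipeline described above can be sketched as follows. This is a minimal illustration only, assuming Gaussian-emission HMMs and the hmmlearn Python library; the feature layout, number of hidden states, and function names are hypothetical and not taken from the paper, which may instead use discrete-observation HMMs over clustered features.

import numpy as np
from hmmlearn import hmm

def train_gesture_models(train_data, n_states=5):
    # train_data: dict mapping gesture label -> list of (T_i, D) arrays,
    # each holding per-frame skeleton features (e.g. joint positions or angles).
    models = {}
    for label, sequences in train_data.items():
        X = np.concatenate(sequences)              # stack frames of all recordings
        lengths = [len(seq) for seq in sequences]  # per-recording frame counts
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag", n_iter=50)
        model.fit(X, lengths)                      # Baum-Welch (EM) training
        models[label] = model
    return models

def classify_gesture(models, sequence):
    # Return the gesture label whose HMM assigns the highest log-likelihood
    # to the observed feature sequence (an array of shape (T, D)).
    return max(models, key=lambda label: models[label].score(sequence))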
