Feature Processing and Modeling for 6D Motion Gesture Recognition

A 6D motion gesture is represented by a 3D spatial trajectory augmented with three orientation dimensions. Depending on the tracking technology, the motion can be captured explicitly as position and orientation or implicitly as acceleration and angular speed. In this work, we address the problem of motion gesture recognition for command-and-control applications. Our main contribution is to investigate the relative effectiveness of various feature dimensions for motion gesture recognition in both user-dependent and user-independent cases. We introduce a statistical feature-based classifier as the baseline and propose an HMM-based recognizer, which offers more flexibility in feature selection and achieves higher recognition accuracy than the baseline system. Our motion gesture database, which contains both explicit and implicit motion information, allows us to compare the recognition performance of different tracking signals on common ground. This study also gives insight into the recognition rates attainable with different tracking devices, which helps system designers choose an appropriate tracking technology.
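To make the baseline concrete, the following is a minimal, hypothetical sketch of a statistical feature-based gesture classifier operating on 6D sample sequences. The feature choices (per-channel mean, per-channel standard deviation, and total path length) and the nearest-class-mean decision rule are illustrative assumptions, not the paper's exact feature set or classifier.

```python
import numpy as np

def extract_features(traj):
    """traj: (T, 6) array of 6D samples (e.g. 3D position + 3D orientation,
    or 3D acceleration + 3D angular speed). Returns a fixed-length vector.
    Feature set is an illustrative assumption: per-channel mean and std,
    plus total path length of the 6D trajectory."""
    traj = np.asarray(traj, dtype=float)
    means = traj.mean(axis=0)
    stds = traj.std(axis=0)
    path_len = np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))
    return np.concatenate([means, stds, [path_len]])

class NearestMeanGestureClassifier:
    """Toy statistical baseline: classify a gesture by Euclidean
    distance from its feature vector to each class's mean feature vector."""

    def fit(self, trajectories, labels):
        feats = np.array([extract_features(t) for t in trajectories])
        self.classes_ = sorted(set(labels))
        self.means_ = np.array([
            feats[[lab == c for lab in labels]].mean(axis=0)
            for c in self.classes_])
        return self

    def predict(self, traj):
        f = extract_features(traj)
        dists = np.linalg.norm(self.means_ - f, axis=1)
        return self.classes_[int(np.argmin(dists))]
```

Because the feature vector has a fixed length regardless of gesture duration, this style of classifier is simple and fast; an HMM-based recognizer, by contrast, models the sequence itself and can exploit the temporal ordering that fixed-length statistics discard.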
