One shot learning gesture recognition with Kinect sensor

Gestures are natural and intuitive for Human-Computer Interaction (HCI), and the one-shot learning scenario reflects a common real-world constraint on gesture recognition: each gesture class is demonstrated only once. In this demo, we present a hand gesture recognition system based on the Kinect sensor that addresses one-shot learning gesture recognition with a user-defined training and testing procedure. Such a system can behave like a remote control, where the user assigns a specific function to a preferred gesture by performing it only once. To recognize gestures, the system first automatically segments an action sequence into atomic tokens and then adopts the Extended Motion History Image (Extended-MHI) for motion feature representation. We evaluate the performance of our system quantitatively on the ChaLearn Gesture Challenge and apply it to a virtual one-shot learning gesture recognition system.
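As a rough illustration of the motion representation mentioned above, the following is a minimal NumPy sketch of the classical Motion History Image (MHI) update, on which Extended-MHI builds; the frame-differencing threshold, the decay parameter tau, and the function name update_mhi are illustrative assumptions, not the authors' implementation.

import numpy as np

def update_mhi(mhi, prev_frame, curr_frame, tau=30, diff_thresh=25):
    """One step of the classical MHI update (illustrative sketch).

    mhi         : float array holding the current motion history
    prev_frame  : previous grayscale/depth frame
    curr_frame  : current grayscale/depth frame
    tau         : history length in frames (assumed value)
    diff_thresh : frame-difference threshold for "motion" pixels (assumed value)
    """
    # Binary motion mask from simple frame differencing.
    motion = np.abs(curr_frame.astype(np.float32) -
                    prev_frame.astype(np.float32)) > diff_thresh
    # Moving pixels are reset to tau; all others decay by 1 toward 0.
    return np.where(motion, float(tau), np.maximum(mhi - 1.0, 0.0))

# Example usage: accumulate an MHI over a clip of Kinect frames (H x W arrays).
# frames = [...]  # list of depth or grayscale frames for one atomic token
# mhi = np.zeros_like(frames[0], dtype=np.float32)
# for prev, curr in zip(frames[:-1], frames[1:]):
#     mhi = update_mhi(mhi, prev, curr)

The resulting MHI encodes where and how recently motion occurred within a segmented token, which is the kind of compact motion descriptor that one-shot template matching can compare against a single stored example.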
