Markerless Hand Gesture Interface Based on LEAP Motion Controller

Hand gesture interfaces provide an intuitive and natural way of interacting with a wide range of applications. The development of these interfaces is now supported by a growing number of sensing devices able to track hand and finger movements. Nevertheless, the physical and technical characteristics of many of these devices make them unsuitable for implementing interfaces oriented to everyday desktop applications. Conversely, the LEAP Motion controller has been specifically designed to interact with such applications. Moreover, it provides a hand skeletal model whose tracking data reach a high level of accuracy. This paper describes a novel approach to defining and recognizing hand gestures. The proposed method adopts freehand drawing recognition algorithms to interpret the tracking data of hand and finger movements. Although our approach is applicable to any hand skeletal model, the overall features of the model provided by the LEAP Motion controller led us to adopt it as the reference model. Extensive preliminary tests have demonstrated the usefulness and accuracy of the proposed method.
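To make the core idea concrete, the sketch below shows one plausible way to feed hand-tracking data into a drawing recognizer: a fingertip trajectory from a hand skeletal model is projected onto a plane, resampled, normalized, and matched against stored gesture templates, in the spirit of $1-style stroke recognizers. This is an illustrative assumption rather than the recognition engine described in the paper; all function names, parameters, and the template format are hypothetical, and rotation invariance is omitted for brevity.

```python
# Illustrative sketch only: trajectory format, template set, and recognizer are
# assumptions, not the paper's actual method or the LEAP Motion SDK API.
import math

N_POINTS = 64  # fixed number of resampled points per stroke


def project_to_plane(trajectory_3d):
    """Project 3D fingertip positions (x, y, z) onto the x-y plane."""
    return [(x, y) for x, y, _z in trajectory_3d]


def path_length(points):
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))


def resample(points, n=N_POINTS):
    """Resample the stroke into n roughly equidistant points."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    resampled = [pts[0]]
    accumulated = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and accumulated + d >= interval:
            t = (interval - accumulated) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)  # the interpolated point becomes the next segment start
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    while len(resampled) < n:  # guard against rounding shortfalls
        resampled.append(pts[-1])
    return resampled[:n]


def normalize(points, size=100.0):
    """Scale the stroke into a reference square and move its centroid to the origin."""
    xs, ys = zip(*points)
    w, h = (max(xs) - min(xs)) or 1e-6, (max(ys) - min(ys)) or 1e-6
    scaled = [(x * size / w, y * size / h) for x, y in points]
    cx = sum(p[0] for p in scaled) / len(scaled)
    cy = sum(p[1] for p in scaled) / len(scaled)
    return [(x - cx, y - cy) for x, y in scaled]


def stroke_distance(a, b):
    """Average point-to-point distance between two equal-length strokes."""
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)


def recognize(trajectory_3d, templates):
    """Return the name of the closest gesture template for a fingertip trajectory."""
    stroke = normalize(resample(project_to_plane(trajectory_3d)))
    return min(templates, key=lambda name: stroke_distance(stroke, templates[name]))
```

In such a setup, each gesture class would be represented by one or more templates built by passing recorded fingertip trajectories through the same resampling and normalization pipeline, so that recognition reduces to a nearest-template comparison.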
