Gesture vs. Gesticulation: A Test Protocol

In recent years, gesture recognition has gained increasing attention in the Human-Computer Interaction community. However, gesture segmentation, one of the most challenging tasks in gesture recognition applications, is still an open issue. Gesture segmentation has two main objectives: first, detecting when a gesture begins and ends; second, recognizing whether a gesture is meant to be meaningful for the machine or is a non-command movement (such as gesticulation). This paper proposes a novel test protocol for evaluating different techniques that separate command gestures from non-command gestures. Finally, we show how we adapted our test protocol to design a touchless, always-available interaction system in which the user communicates directly with the computer through a wearable and "intimate" interface based on electromyographic (EMG) signals.
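The paper does not specify a segmentation algorithm in this abstract, but the first objective (detecting when a gesture begins and ends) is commonly approached with amplitude-threshold onset/offset detection on the rectified EMG envelope. The sketch below is purely illustrative under that assumption; the threshold, window length, and minimum-duration parameters are hypothetical, not values from the paper.

```python
import numpy as np

def segment_gestures(emg, fs, threshold, min_duration_s=0.1):
    """Return (start, end) sample indices of candidate gesture segments.

    emg: 1-D raw EMG signal; fs: sampling rate in Hz.
    A segment is a run where the smoothed, rectified signal stays
    above `threshold` for at least `min_duration_s` seconds.
    """
    # Rectify and smooth with a ~100 ms moving-average envelope.
    window = max(1, int(0.1 * fs))
    envelope = np.convolve(np.abs(emg), np.ones(window) / window, mode="same")

    # Boolean activity mask and its rising/falling edges.
    active = envelope > threshold
    edges = np.diff(active.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if active[0]:               # signal already active at the start
        starts = np.r_[0, starts]
    if active[-1]:              # signal still active at the end
        ends = np.r_[ends, active.size]

    # Discard bursts shorter than the minimum gesture duration.
    min_len = int(min_duration_s * fs)
    return [(s, e) for s, e in zip(starts, ends) if e - s >= min_len]
```

Note that such a detector only finds activity bursts; the second objective (deciding whether a detected segment is a command gesture or mere gesticulation) requires a classifier on top, which is exactly what the proposed test protocol is meant to evaluate.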
