Perspectives of Gestures for Gestural-Based Interaction Systems: Towards Natural Interaction

A frequently cited benefit of gesture-based input to computing systems is that it makes interaction natural. However, it is not uncommon to find gesture sets consisting of arbitrary (hand) formations mapped illogically to functions, which defeats the purpose of using gestures to facilitate natural interaction. The root of the issue appears to be a separation between what counts as a gesture in the computing field and what counts as a gesture linguistically. To find common ground, this paper examines the fundamental aspects of gestures in the literature of psycholinguistic studies and HCI studies. The discussion focuses on the connection between the two perspectives: in the definition aspect, through the concepts of meaning and context; and in the classification aspect, through the mapping of tasks (manipulative or communicative) to gesture functions (ergotic, epistemic, or semiotic). By highlighting how these two perspectives interrelate, this paper provides a basis for research that proposes gestures as the interaction modality for interactive systems.
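To make the task-to-function mapping concrete, the sketch below encodes the two task types and the three gesture functions (Cadoz's ergotic/epistemic/semiotic triad) as a small data structure. The particular associations shown are an illustrative assumption drawn from the discussion above, not a normative taxonomy, and the names (`Task`, `GestureFunction`, `TASK_TO_FUNCTIONS`) are hypothetical.

```python
from enum import Enum, auto


class Task(Enum):
    """Task perspective common in HCI-based studies."""
    MANIPULATIVE = auto()   # acting on objects in the environment
    COMMUNICATIVE = auto()  # conveying information to another agent


class GestureFunction(Enum):
    """Gesture functions after Cadoz's ergotic/epistemic/semiotic triad."""
    ERGOTIC = auto()    # modifying the environment (e.g., grasping, moving)
    EPISTEMIC = auto()  # exploring the environment (e.g., touching a surface)
    SEMIOTIC = auto()   # conveying meaning (e.g., pointing, waving)


# Hypothetical mapping in the spirit of the discussion above: manipulative
# tasks draw on ergotic and epistemic functions, while communicative tasks
# draw on the semiotic function.
TASK_TO_FUNCTIONS = {
    Task.MANIPULATIVE: {GestureFunction.ERGOTIC, GestureFunction.EPISTEMIC},
    Task.COMMUNICATIVE: {GestureFunction.SEMIOTIC},
}


def functions_for(task: Task) -> set:
    """Return the gesture functions associated with a task type."""
    return TASK_TO_FUNCTIONS[task]


if __name__ == "__main__":
    for task in Task:
        names = ", ".join(f.name.lower() for f in sorted(
            functions_for(task), key=lambda f: f.name))
        print(f"{task.name.lower()} task -> {names}")
```

A designer auditing a gesture set could use such a mapping as a lint: a gesture proposed for a communicative task but realized as a purely ergotic formation would be flagged as a likely arbitrary mapping.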
