User-Defined Gestures for Gestural Interaction: Extending from Hands to Other Body Parts

ABSTRACT Most gesture elicitation studies of gestural interaction have focused on hand gestures, and few have considered the involvement of other body parts. Moreover, most of these studies used the frequency of proposed gestures as the main index, and their participants were unfamiliar with the design space. In this study, we developed a gesture set that includes both hand and non-hand gestures by combining three indices: gesture frequency, subjective ratings, and physiological risk ratings. In Experiment 1, we collected candidate gestures with a user-defined method, asking participants to perform gestures of their choice for the 15 most commonly used commands, without any restriction on the body parts involved. In Experiment 2, a new group of participants evaluated the representative gestures obtained in Experiment 1. The resulting gesture set includes gestures made with the hands as well as with other body parts, and it reveals three user preferences: a preference for one-handed movements, a preference for gestures with social meaning, and a preference for dynamic gestures over static gestures.
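The abstract does not specify how the three indices were weighted and combined; the Python sketch below illustrates one plausible way to normalize gesture frequency, subjective ratings, and a REBA-style physiological risk score into a single selection score per candidate gesture. The class, field names, rating scales, and weights are assumptions for illustration only, not the authors' procedure.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str          # e.g., "wave right hand"
    frequency: int     # number of participants who proposed it for the command
    rating: float      # mean subjective rating, assumed on a 1-7 scale
    risk: float        # physiological risk, assumed REBA-style (1 = low, 15 = high)

def normalize(value: float, lo: float, hi: float) -> float:
    """Scale a value into [0, 1]; guard against a zero-width range."""
    return 0.0 if hi == lo else (value - lo) / (hi - lo)

def composite_score(c: Candidate, n_participants: int,
                    w_freq: float = 0.4, w_rating: float = 0.4,
                    w_risk: float = 0.2) -> float:
    """Hypothetical weighted combination: higher frequency and rating raise
    the score, higher physiological risk lowers it. Weights are illustrative."""
    f = normalize(c.frequency, 0, n_participants)
    r = normalize(c.rating, 1, 7)
    k = 1.0 - normalize(c.risk, 1, 15)   # invert so lower risk scores higher
    return w_freq * f + w_rating * r + w_risk * k

# Usage: pick the highest-scoring candidate gesture for one command.
candidates = [
    Candidate("wave right hand", frequency=12, rating=5.8, risk=3.0),
    Candidate("nod head twice", frequency=7, rating=6.1, risk=2.0),
]
best = max(candidates, key=lambda c: composite_score(c, n_participants=20))
print(best.name)
```

Such a weighted sum is only one of several reasonable aggregation schemes; the study itself may have applied the indices sequentially (e.g., frequency to shortlist, then ratings and risk to select) rather than as a single score.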
