Creating 3D/Mid-air gestures

Researchers and developers have continually proposed various forms of gestures for computing applications. Choosing the best gesture set for a given application remains difficult, and there is an ongoing debate in the literature over whether users should be included in the gesture design process. This paper elaborates on that debate by synthesizing the ideas and theories put forth in previous work, and describes the emergence of the user-centered approach against the long-standing dominance of the developer-based approach in this research area. Three influential methods are summarized to represent the essence of the user-centered approach, and recent works that applied these methods are reviewed to examine the various ways they were adopted and adapted in creating gesture languages for computing systems. By presenting an overview of our observations and findings, we hope to offer another perspective on the user-centered design approach that will be of assistance to researchers with similar interests in this area.
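
The user-centered methods surveyed here typically derive a consensus gesture set by eliciting proposals from participants and measuring how strongly they agree for each referent. As a minimal sketch only, assuming the agreement-rate formulation popularized by Wobbrock and colleagues (the function and example data below are illustrative, not taken from this paper):

    # Minimal sketch of an agreement-rate computation for one referent in a
    # gesture elicitation study (hypothetical helper, illustrative data).
    from collections import Counter

    def agreement_rate(proposals):
        # proposals: list of gesture labels proposed by participants for one referent
        n = len(proposals)
        if n < 2:
            return 1.0
        group_sizes = Counter(proposals).values()  # sizes of identical-proposal groups
        # fraction of participant pairs that proposed the same gesture
        return sum(k * (k - 1) for k in group_sizes) / (n * (n - 1))

    # Example: 20 participants propose gestures for the referent "volume up".
    volume_up = ["swipe up"] * 12 + ["raise hand"] * 5 + ["thumb up"] * 3
    print(round(agreement_rate(volume_up), 2))  # 0.42; "swipe up" would become the consensus gesture

In elicitation studies of this kind, the most frequently proposed gesture per referent is usually assigned to that command, and the agreement scores indicate which commands have a clear user consensus and which need designer judgment.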
