Empirical Research in Mid-Air Interaction: A Systematic Review

ABSTRACT Mid-air interaction is a distinct style of natural HCI (Human-Computer Interaction). In mid-air interaction, users employ their whole body—with a strong focus on the hands—applying gestures, postures, and movements to interact with digital content on distant displays or remote devices. Although the idea of exploiting body movements and gestures in HCI is not new, mid-air interaction has become practical with the availability of reasonably priced depth cameras and sensors. This article presents a systematic review of empirical research in mid-air interaction, based on a corpus of 104 publications from 2011 to 2018. The review includes: (a) a retrospective on mid-air interaction, outlining its historical development and clarifying important concepts; (b) current application domains of mid-air interaction; (c) user requirements methods, focusing on gesture elicitation studies; (d) dimensions of prototyping and design; (e) empirical evaluation methods and issues; and (f) a discussion of trends and challenges for further research and development.
