Comparing selection mechanisms for gaze input techniques in head-mounted displays

Abstract: Head movements are a common input modality on VR/AR headsets. However, although they enable users to control a cursor, they lack an integrated method for triggering actions. Many approaches exist to fill this gap: dedicated "clickers", on-device buttons, mid-air gestures, dwell, speech, and new input techniques based on matching head motions to those of visually presented targets. These proposals are diverse, and there is currently a lack of empirical data on the performance of, experience with, and preference for these different techniques. This hampers designers' ability to select appropriate input techniques to deploy. We conduct two studies that address this problem. A Fitts' Law study compares five traditional selection techniques and concludes that clicker (hands-on) and dwell (hands-free) provide optimal combinations of precision, speed, and physical load. A follow-up study compares clicker and dwell to a motion matching implementation. While clicker remains fastest and dwell most accurate, motion matching may provide a valuable compromise between these two poles.
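To make the two hands-free mechanisms in the abstract concrete, the following minimal sketch contrasts them. It is not the paper's implementation: the Python names (DwellSelector, MotionMatchingSelector) and the thresholds (DWELL_TIME_S, MATCH_WINDOW, MATCH_THRESHOLD) are illustrative assumptions. Dwell fires when the head cursor rests on a target for a fixed time; motion matching, in the Pursuits style, fires when recent cursor motion correlates strongly with the trajectory of a moving target.

```python
import math
from collections import deque

DWELL_TIME_S = 0.8      # assumed dwell threshold (seconds)
MATCH_WINDOW = 30       # assumed number of motion samples to correlate
MATCH_THRESHOLD = 0.9   # assumed Pearson-correlation trigger level

class DwellSelector:
    """Select a target once the head cursor has rested on it long enough."""
    def __init__(self):
        self.current = None
        self.entered_at = 0.0

    def update(self, hovered_target, t):
        # Restart the timer whenever the hovered target changes.
        if hovered_target != self.current:
            self.current, self.entered_at = hovered_target, t
            return None
        if self.current is not None and t - self.entered_at >= DWELL_TIME_S:
            self.current = None  # reset so the selection fires only once
            return hovered_target
        return None

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences (0.0 if degenerate)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

class MotionMatchingSelector:
    """Select the moving target whose trajectory best matches head motion."""
    def __init__(self, target_ids):
        self.cursor = deque(maxlen=MATCH_WINDOW)
        self.targets = {tid: deque(maxlen=MATCH_WINDOW) for tid in target_ids}

    def update(self, cursor_xy, target_positions):
        self.cursor.append(cursor_xy)
        for tid, xy in target_positions.items():
            self.targets[tid].append(xy)
        if len(self.cursor) < MATCH_WINDOW:
            return None  # not enough motion history yet
        cx = [p[0] for p in self.cursor]
        cy = [p[1] for p in self.cursor]
        for tid, trail in self.targets.items():
            tx = [p[0] for p in trail]
            ty = [p[1] for p in trail]
            # Require both axes to correlate; a stationary cursor yields
            # zero correlation, so it cannot trigger a selection.
            if min(pearson(cx, tx), pearson(cy, ty)) >= MATCH_THRESHOLD:
                return tid
        return None
```

In a per-frame loop, one would call dwell.update(hovered, t) with the currently hovered target and timestamp, and matcher.update(cursor_xy, target_positions) with the cursor and target coordinates, treating any non-None return value as a selection event.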
