The eyes don't have it: an empirical comparison of head-based and eye-based selection in virtual reality

We present a study comparing selection performance across three eye/head interaction techniques using the recently released FOVE head-mounted display (HMD). The FOVE offers an integrated eye tracker, which we use as an alternative to the potentially fatiguing and uncomfortable head-based selection employed by other commercial devices. Our experiment was modelled on the ISO 9241-9 reciprocal selection task, with targets presented at varying depths in a custom virtual environment. We compared three conditions: eye-based selection and head-based selection (i.e., head orientation as a gaze proxy), each in isolation, and a third condition combining eye tracking and head tracking. Results indicate that eye-only selection offered the worst performance in terms of error rate, selection time, and throughput, while head-only selection offered significantly better performance.
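Throughput in ISO 9241-9 studies is typically computed with the effective-width method (ID_e divided by mean movement time). The paper does not reproduce its formula, so the sketch below is an assumption based on that common convention; the constant 4.133 and the mean-of-means aggregation follow the standard practice, not necessarily this study's exact pipeline.

```python
import math
import statistics

def fitts_throughput(amplitudes, endpoint_sd, movement_times):
    """Throughput (bits/s) via the effective-width method used with
    ISO 9241-9 tasks. A sketch, not this paper's exact computation.

    amplitudes     -- movement amplitudes per trial (e.g., metres)
    endpoint_sd    -- standard deviation of selection endpoints
                      along the task axis (same units as amplitudes)
    movement_times -- per-trial movement times in seconds
    """
    # Effective width: 4.133 x SD of the selection endpoints
    w_e = 4.133 * endpoint_sd
    # Effective index of difficulty per trial, in bits (Shannon formulation)
    ids = [math.log2(a / w_e + 1.0) for a in amplitudes]
    # Mean ID over mean MT for the sequence
    return statistics.mean(ids) / statistics.mean(movement_times)
```

For example, with a 0.3 m amplitude, an effective width of 0.1 m (ID_e = 2 bits), and 1 s movement times, this yields a throughput of 2 bits/s; per-condition throughputs like the eye-only vs. head-only comparison in the abstract would be averaged over participants.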
