A comparison of eye-head coordination between virtual and physical realities

Past research has shown that humans exhibit characteristic eye-head responses to the appearance of visual stimuli, and that these natural reactions change across different activities. Our work builds on these observations by offering new insight into how humans behave in Virtual Reality (VR) compared to Physical Reality (PR). Using eye- and head-tracking technology, and by conducting a study on two groups of users (participants in VR or PR), we identify how often these natural responses are observed in each environment. We find that users move their heads significantly more often when viewing stimuli in VR than in PR, and that VR users also move their heads more in the presence of text. We open a discussion on identifying the head-worn display (HWD) factors that cause this difference, as it may affect not only predictive models that use eye movements as features, but also the VR user experience overall.