Real-time recording and classification of eye movements in an immersive virtual environment.

Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial setup and provide the algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools (available at http://sourceforge.net/p/utdvrlibraries/) that temporally synchronize the gaze data with the visual stimulus and enable real-time assembly of a video-based record of the experiment in the QuickTime MOV format. This record contains the visual stimulus, the gaze cursor, and associated numerical data, and can be used for data export, visual inspection, and validation of the calculated gaze movements.
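As a rough illustration of the kind of computations described above, the sketch below (Python with NumPy; the function names and velocity thresholds are illustrative assumptions, not the published library's API) computes the angular distance between the gaze direction and the eye-to-object vector, and applies a simple velocity-threshold rule to label a sample as a fixation, pursuit, or saccade.

    import numpy as np

    def angular_distance_deg(gaze_dir, eye_pos, target_pos):
        # Angle (degrees) between the gaze direction and the eye-to-target vector.
        to_target = np.asarray(target_pos, float) - np.asarray(eye_pos, float)
        gaze = np.asarray(gaze_dir, float)
        cos_theta = np.dot(gaze, to_target) / (
            np.linalg.norm(gaze) * np.linalg.norm(to_target))
        return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    def classify_sample(angular_velocity_deg_per_s,
                        fixation_max=30.0, pursuit_max=100.0):
        # Illustrative thresholds only; suitable values depend on the tracker,
        # sampling rate, and task.
        if angular_velocity_deg_per_s < fixation_max:
            return "fixation"
        if angular_velocity_deg_per_s < pursuit_max:
            return "pursuit"
        return "saccade"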
