GazeCode: open-source software for manual mapping of mobile eye-tracking data

Purpose: Eye movements recorded with mobile eye trackers generally have to be mapped to the visual stimulus manually, and manufacturer software often has a sub-optimal user interface for this task. Here we compare GazeCode, our in-house developed open-source alternative, with the manufacturer software. Method: 330 seconds of eye movements were recorded with the Tobii Pro Glasses 2. Eight coders subsequently categorized fixations using both Tobii Pro Lab and GazeCode. Results: Average manual mapping speed was more than twice as fast with GazeCode (0.649 events/s) as with Tobii Pro Lab (0.292 events/s). Inter-rater reliability (Cohen's kappa) was similar and satisfactory: 0.886 for Tobii Pro Lab and 0.871 for GazeCode. Conclusion: GazeCode is a faster alternative to Tobii Pro Lab for mapping eye movements to the visual stimulus. Moreover, it accepts eye-tracking data from SMI, Positive Science, Tobii, and Pupil Labs eye trackers.
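The inter-rater reliability figures above are Cohen's kappa values, which correct raw agreement between two coders for agreement expected by chance. As an illustration only (this helper is not part of GazeCode or Tobii Pro Lab), the statistic for two coders' categorical labels can be sketched as:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical labels (Cohen, 1960)."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: proportion of items both coders labeled identically.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: expected overlap given each coder's label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)
```

By the Landis and Koch (1977) benchmarks cited in the reference list, kappa values above 0.81, such as the 0.886 and 0.871 reported here, indicate almost perfect agreement.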
