Evaluating the data quality of the Gazepoint GP3 low-cost eye tracker when used independently by study participants

The portability of low-cost eye trackers makes them attractive for research outside the laboratory, where participants may need to operate the equipment independently. The present work compared the data quality of the Gazepoint GP3 when used independently by research participants with that obtained by expert eye-tracking users. Twenty participants completed a training session and a testing session 1 week apart. At the training session, participants were taught how to set up and use the eye-tracking hardware and software and how to complete two tasks: a calibration task measuring accuracy and precision, and a visual search task assessing target fixations. At the testing session, participants set up the Gazepoint eye tracker and completed the two tasks without assistance. Participants' accuracy, precision, and visual search performance were compared with values obtained from two expert eye-tracking users. Additionally, the effective sampling rate, which is sensitive to factors such as head motion, was assessed in both the participants and the expert users. Participant accuracy and precision closely approximated expert values. Participant target fixations were detected with 92.5% sensitivity and 76.8% specificity, closely mirroring expert sensitivity and specificity. The distribution of inter-sample intervals was also similar between participants and experts (means of 16.99 ± 3.0 ms and 16.43 ± 2.3 ms, respectively). When used independently, the data quality obtained from a low-cost, portable eye-tracking setup closely approximated expert values and was adequate for some studies that require independent use by study participants.
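For readers less familiar with these metrics, the sketch below illustrates how accuracy, precision, fixation-detection sensitivity/specificity, and the effective sampling interval are commonly computed from raw gaze samples. It is a minimal illustration under standard definitions from the eye-tracking literature (e.g., Holmqvist et al., 2012), not the study's analysis code; all function and variable names are hypothetical, and gaze coordinates are assumed to be pre-converted to degrees of visual angle.

```python
# Minimal sketch (assumed, not the study's analysis code) of the data-quality
# metrics described above: accuracy as mean angular offset from the target,
# precision as sample-to-sample RMS, plus detection scoring and inter-sample
# intervals. Gaze coordinates are assumed to be in degrees of visual angle.
import numpy as np

def accuracy_deg(gaze_x, gaze_y, target_x, target_y):
    """Accuracy: mean angular offset between gaze samples and a known target."""
    return float(np.mean(np.hypot(gaze_x - target_x, gaze_y - target_y)))

def precision_rms_deg(gaze_x, gaze_y):
    """Precision: root mean square of successive sample-to-sample distances."""
    dx, dy = np.diff(gaze_x), np.diff(gaze_y)
    return float(np.sqrt(np.mean(dx**2 + dy**2)))

def inter_sample_intervals_ms(timestamps_s):
    """Effective sampling interval; ~16.7 ms for a nominal 60-Hz tracker."""
    return np.diff(timestamps_s) * 1000.0

def sensitivity_specificity(detected, present):
    """Score target-fixation detection against ground truth (boolean arrays)."""
    detected = np.asarray(detected, dtype=bool)
    present = np.asarray(present, dtype=bool)
    sensitivity = np.sum(detected & present) / np.sum(present)
    specificity = np.sum(~detected & ~present) / np.sum(~present)
    return float(sensitivity), float(specificity)
```

For a nominal 60-Hz tracker such as the GP3, inter-sample intervals clustered near 16.7 ms indicate stable tracking, while longer gaps typically reflect dropped samples from head motion or track loss, which is why the interval distribution serves as a data-quality indicator here.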
