Measuring the Spatial Noise of a Low-Cost Eye Tracker to Enhance Fixation Detection

The present study evaluates the quality of gaze data produced by a low-cost eye tracker (The Eye Tribe©, The Eye Tribe, Copenhagen, Denmark) to verify its suitability for scientific research. An integrated methodological framework based on both artificial-eye measurements and human eye-tracking data is proposed for the experimental process. The obtained results are used to remove the modeled noise, both through manual filtering and during fixation detection. The outcomes are intended to serve as a robust reference for verifying the validity of low-cost solutions, as well as a guide for selecting appropriate fixation-detection parameters when analyzing experimental data collected with this device. The results show higher deviation values for the real test persons than for the artificial eyes, but these remain acceptable for use in a scientific setting.
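The workflow sketched in the abstract (quantify spatial noise, then feed that estimate into fixation detection) can be illustrated with a minimal example. Note that this is a generic sketch, not the paper's implementation: the RMS sample-to-sample metric is one common precision measure, and the dispersion-based grouping is a textbook I-DT variant; the function names, the `max_dispersion`/`min_samples` parameters, and the choice of bounding-box dispersion are all assumptions for illustration.

```python
import math

def rms_s2s(xs, ys):
    """Root-mean-square sample-to-sample deviation, a common
    spatial-precision metric for eye-tracking data (e.g. in the
    units of the recording: pixels or degrees)."""
    d2 = [(xs[i + 1] - xs[i]) ** 2 + (ys[i + 1] - ys[i]) ** 2
          for i in range(len(xs) - 1)]
    return math.sqrt(sum(d2) / len(d2))

def idt_fixations(xs, ys, max_dispersion, min_samples):
    """Generic dispersion-threshold (I-DT) fixation detection:
    group consecutive samples whose bounding-box dispersion
    (width + height) stays below max_dispersion.  Returns
    (start_index, end_index, centroid_x, centroid_y) tuples."""
    fixations, start, n = [], 0, len(xs)
    while start + min_samples <= n:
        end = start + min_samples
        win_x, win_y = xs[start:end], ys[start:end]
        disp = (max(win_x) - min(win_x)) + (max(win_y) - min(win_y))
        if disp <= max_dispersion:
            # Grow the window until dispersion exceeds the threshold.
            while end < n:
                win_x.append(xs[end])
                win_y.append(ys[end])
                if (max(win_x) - min(win_x)) + (max(win_y) - min(win_y)) > max_dispersion:
                    break
                end += 1
            k = end - start
            fixations.append((start, end,
                              sum(xs[start:end]) / k,
                              sum(ys[start:end]) / k))
            start = end
        else:
            start += 1
    return fixations
```

The link between the two steps is the key point of the paper's approach: a measured precision value (here, `rms_s2s` on artificial-eye recordings) gives a principled lower bound for `max_dispersion`, instead of an arbitrary hand-picked threshold.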
