Region of interest generation algorithms for eye tracking data

Using human fixation behavior, we can infer which regions need to be processed at high resolution and where stronger compression can be applied. Analyzing the visual scan path solely on the basis of a predefined set of regions of interest (ROIs) limits the exploratory scope of the analysis: insights can only be gained for those regions that the data analyst considered worth labeling. Furthermore, visual exploration is naturally time-dependent: a short initial overview phase may be followed by an in-depth analysis of the regions that attracted the most attention. Therefore, the shape and size of regions of interest may change over time. Automatic ROI generation can help by reshaping the ROIs to the data of a given time slice. We developed three novel methods for automatic ROI generation and show their applicability to different eye tracking data sets. The methods are publicly available as part of the EyeTrace software at http://www.ti.uni-tuebingen.de/Eyetrace.175L0.html
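
The abstract does not describe the three methods themselves. As a minimal sketch of the general idea of data-driven, time-dependent ROI generation, the following Python snippet clusters the fixations of a single time slice and derives one bounding-box ROI per cluster; the function name, the DBSCAN-based clustering, and all thresholds are assumptions for illustration, not the paper's algorithms.

```python
# Hypothetical sketch: derive ROIs from the fixations of one time slice.
# NOT one of the paper's three methods; all names and thresholds are assumed.

import numpy as np
from sklearn.cluster import DBSCAN

def rois_for_time_slice(fixations, t_start, t_end, eps_px=60, min_fixations=3):
    """Cluster fixations with timestamps in [t_start, t_end) and return
    one axis-aligned bounding box (x_min, y_min, x_max, y_max) per cluster.

    fixations: array-like of rows (timestamp, x, y), coordinates in pixels.
    """
    fixations = np.asarray(fixations, dtype=float)
    in_slice = (fixations[:, 0] >= t_start) & (fixations[:, 0] < t_end)
    points = fixations[in_slice, 1:3]
    if len(points) == 0:
        return []

    # Density-based clustering groups spatially close fixations;
    # label -1 marks isolated fixations treated as noise.
    labels = DBSCAN(eps=eps_px, min_samples=min_fixations).fit_predict(points)

    rois = []
    for label in set(labels) - {-1}:
        cluster = points[labels == label]
        x_min, y_min = cluster.min(axis=0)
        x_max, y_max = cluster.max(axis=0)
        rois.append((x_min, y_min, x_max, y_max))
    return rois

# Example: ROIs for the first two seconds of a recording.
# rois = rois_for_time_slice(fixations, t_start=0.0, t_end=2.0)
```

Re-running such a procedure per time slice lets the ROI shapes and sizes follow the attention distribution over time, which is the behavior the abstract motivates.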
