Visualizing Dynamic Ambient/Focal Attention with Coefficient \(\mathcal{K}\)

Using the coefficient \(\mathcal{K}\), defined on a parametric scale and computed from a conventionally eye-tracked time course of eye movements, we propose a straightforward method of visualizing ambient/focal fixations in both scanpath and heatmap visualizations. The \(\mathcal{K}\) coefficient expresses the difference between fixation duration and the following saccade's amplitude in standard deviation units, facilitating parametric statistical testing. Positive and negative values of \(\mathcal{K}\) indicate focal and ambient fixations, respectively, and are colored by luminance variation depicting the relative intensity of focal fixation.
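The per-fixation computation described above can be sketched as follows. This is a minimal illustration, not the authors' reference implementation: it assumes fixation durations and subsequent saccade amplitudes are standardized (z-scored) over the whole recording, with \(\mathcal{K}_i\) taken as the difference of the two z-scores for fixation \(i\); the function name `coefficient_k` and the sample-standard-deviation choice (`ddof=1`) are our own.

```python
import numpy as np

def coefficient_k(durations, amplitudes):
    """Per-fixation ambient/focal coefficient (illustrative sketch).

    durations:  fixation durations (e.g., in ms), one per fixation.
    amplitudes: amplitude of the saccade following each fixation
                (e.g., in degrees of visual angle), same length.

    Returns K_i = z(duration_i) - z(amplitude_i): positive values
    suggest focal fixations (long fixation, short following saccade),
    negative values suggest ambient fixations.
    """
    d = np.asarray(durations, dtype=float)
    a = np.asarray(amplitudes, dtype=float)
    # Standardize each measure over the recording so the difference
    # is expressed in standard deviation units.
    z_d = (d - d.mean()) / d.std(ddof=1)
    z_a = (a - a.mean()) / a.std(ddof=1)
    return z_d - z_a
```

In a visualization, each \(\mathcal{K}_i\) value would then drive the luminance of the corresponding fixation marker, with sign selecting the focal versus ambient coloring.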
