Eye Tracking for Dynamic, User-Driven Workflows

Researchers at Sandia National Laboratories in Albuquerque, New Mexico, are engaged in the empirical study of human-information interaction in high-consequence national security environments. This focus emerged from our longstanding interactions with military and civilian intelligence analysts working across a broad array of domains, from signals intelligence to cybersecurity to geospatial imagery analysis. In this paper, we discuss how several years of work with Synthetic Aperture Radar (SAR) imagery analysts revealed the limitations of existing eye tracking systems for capturing gaze events during the dynamic, user-driven problem-solving strategies characteristic of geospatial analytic workflows, and we explain the need for eye tracking systems capable of supporting inductive study of those strategies. We then describe an ongoing project in which we are leveraging some of the unique properties of SAR image products to develop a prototype eye tracking data collection and analysis system that will support inductive studies of visual workflows in SAR image analysis environments.
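To illustrate why dynamic, user-driven workflows strain conventional eye tracking, consider that in a static-stimulus study a fixation's screen position fully identifies what was looked at, whereas an analyst who pans and zooms across a SAR image product continually changes what each screen pixel refers to. The sketch below is not the authors' prototype; it is a minimal illustration, under assumed names and a simplified viewport model (`ViewportState`, `Fixation`, a linear pan/zoom transform), of how screen-space fixations might be resolved back into image-space coordinates using a log of viewer state.

```python
# Illustrative sketch only: resolving screen-space gaze fixations into
# image-space coordinates as the viewer state (pan/zoom) changes over time.
# All class and function names here are hypothetical, not from the paper.
from dataclasses import dataclass


@dataclass
class ViewportState:
    timestamp: float   # seconds since session start
    pan_x: float       # image-space x of the top-left screen pixel
    pan_y: float       # image-space y of the top-left screen pixel
    zoom: float        # screen pixels per image pixel


@dataclass
class Fixation:
    timestamp: float   # seconds since session start
    screen_x: float    # screen pixels
    screen_y: float    # screen pixels
    duration: float    # seconds


def viewport_at(time: float, states: list[ViewportState]) -> ViewportState:
    """Return the most recent viewport state logged at or before `time`.
    Assumes `states` is sorted by timestamp."""
    active = states[0]
    for state in states:
        if state.timestamp <= time:
            active = state
        else:
            break
    return active


def fixation_to_image_coords(fix: Fixation,
                             states: list[ViewportState]) -> tuple[float, float]:
    """Map a screen-space fixation into image-space coordinates using the
    viewport state that was active when the fixation occurred."""
    vp = viewport_at(fix.timestamp, states)
    image_x = vp.pan_x + fix.screen_x / vp.zoom
    image_y = vp.pan_y + fix.screen_y / vp.zoom
    return image_x, image_y


# Example: the same screen location refers to different image content
# before and after the analyst pans and zooms.
states = [
    ViewportState(timestamp=0.0, pan_x=0.0, pan_y=0.0, zoom=0.5),
    ViewportState(timestamp=5.0, pan_x=800.0, pan_y=600.0, zoom=2.0),
]
early = Fixation(timestamp=2.0, screen_x=400.0, screen_y=300.0, duration=0.25)
late = Fixation(timestamp=7.0, screen_x=400.0, screen_y=300.0, duration=0.25)
print(fixation_to_image_coords(early, states))  # (800.0, 600.0)
print(fixation_to_image_coords(late, states))   # (1000.0, 750.0)
```

The point of the sketch is simply that gaze data collected in such environments must be joined against a record of application state before any fixation can be attributed to image content, which is the capability a static-stimulus eye tracking pipeline does not provide.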
