Displays for Effective Human-Agent Teaming: Evaluating Attention Management with Computational Models

In information-dense work domains, the effectiveness of display formats in drawing attention to task-relevant information is critical. In this paper, we demonstrate a method to evaluate this capability for on-screen indicators used to proactively monitor multiple automated agents. To estimate the effectiveness of indicator formats in drawing attention to emerging problems, we compared the visual salience of indicators, as measured by computational saliency models, to the task-relevant attributes needed during proactive monitoring. The results revealed that standard formats generally do not draw attention to the information needed to identify emerging problems in multi-indicator displays, and they validated formats designed to map task-relevant information more closely to visual salience. We additionally report an extended saliency-based monitoring model that predicts task performance from salience, and we discuss implications for broader design and application.
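The core comparison described above (model-predicted salience of each indicator versus the task relevance of that indicator) can be illustrated with a minimal sketch. This is not the paper's analysis pipeline; it assumes a saliency map already produced by any computational model (e.g., Itti-Koch or GBVS style), hypothetical indicator bounding boxes, and hypothetical task-relevance scores, and it simply rank-correlates the two quantities.

```python
import numpy as np
from scipy.stats import spearmanr


def region_saliency(saliency_map, bbox):
    """Mean model-predicted saliency inside one indicator's bounding box."""
    x, y, w, h = bbox
    return float(saliency_map[y:y + h, x:x + w].mean())


def salience_relevance_correlation(saliency_map, indicators):
    """Rank-correlate per-indicator saliency with task relevance.

    `indicators` is a list of (bbox, relevance) pairs, where relevance is a
    task-derived score (e.g., how close a monitored parameter is to a limit).
    """
    sal = [region_saliency(saliency_map, bbox) for bbox, _ in indicators]
    rel = [relevance for _, relevance in indicators]
    rho, p = spearmanr(sal, rel)
    return rho, p


if __name__ == "__main__":
    # Hypothetical example: a 200x300 saliency map and three indicator regions.
    rng = np.random.default_rng(0)
    saliency_map = rng.random((200, 300))
    indicators = [
        ((10, 10, 40, 40), 0.9),   # emerging problem, high task relevance
        ((100, 20, 40, 40), 0.2),  # nominal indicator
        ((200, 60, 40, 40), 0.5),  # intermediate case
    ]
    rho, p = salience_relevance_correlation(saliency_map, indicators)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```

Under this framing, a well-designed format is one for which the correlation is high: the indicators a model predicts will capture attention are also the ones the monitoring task currently requires the operator to notice.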
