A 'Trust But Verify' Design for Course of Action Displays

Abstract: Automation, particularly of complex cognitive tasks, is bound to be incomplete, simplistic, or otherwise less than fully reliable. We have recently begun developing "Trust but Verify" techniques for increasing the effectiveness of even unreliable automation. Under this approach, the user's trust should be conditioned on known situational factors that affect the reliability of the automation, and users should be able to verify the automation's results and operation to varying qualitative degrees, as the level of trust dictates. Here, we describe our preliminary work on these concepts in the domain of Course of Action (COA) selection for an Intruder Interception Task. This task involves deciding which of several available aircraft should be chosen to intercept an unknown aircraft intruding into the airspace. Based on repeated interviews with four subject matter experts, we identified and then distilled a set of factors essential to evaluating the optimal COA. We then designed a set of alternative displays to present these factors, based on the Trust but Verify concept and general human factors display guidance. We analyze the benefits and costs of two major design decisions: whether to display the COA factors using a tabular or graphic organization, and whether (and how) to integrate the COAs with the map or with each other in a common table.
