Supporting Trust Calibration and the Effective Use of Decision Aids by Presenting Dynamic System Confidence Information

Objective: To examine whether continually updated information about a system's confidence in its ability to perform assigned tasks improves operators' trust calibration in, and use of, an automated decision support system (DSS).

Background: The introduction of decision aids often leads to performance breakdowns related to automation bias and trust miscalibration. This can be explained, in part, by the fact that operators are typically informed about overall system reliability only, which makes it impossible for them to decide on a case-by-case basis whether to follow the system's advice.

Method: The application for this research was a neural net-based decision aid that assists pilots with detecting and handling in-flight icing encounters. A multifactorial experiment was carried out with two groups of 15 instructor pilots, each flying a series of 28 approaches in a motion-base simulator. One group was informed about the system's overall reliability only (the fixed group), whereas the other received continually updated system confidence information (the updated group).

Results: Pilots in the updated group experienced significantly fewer icing-related stalls and were more likely to reverse their initial response to an icing condition when it did not produce the desired results. Their estimates of the system's accuracy were also closer to its actual reliability than those of the fixed group.

Conclusion: The presentation of continually updated system confidence information can improve trust calibration and thus lead to better performance of the human-machine team.

Application: The findings from this research can inform the design of decision support systems in a variety of event-driven, high-tempo domains.
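To make the contrast between the two display conditions concrete, the sketch below shows one way a decision aid could surface a per-case confidence score alongside its advice, as opposed to a single static reliability figure. This is a minimal, hypothetical illustration only: the function names, the sigmoid mapping, and the reliability value are assumptions for exposition, not the authors' actual icing system.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch (not the authors' actual system): a decision aid that
# reports, for each encounter, both its advice and a dynamic per-case
# confidence, in contrast to a single fixed overall-reliability figure.

@dataclass
class Advice:
    icing_detected: bool
    confidence: float  # per-case confidence in [0, 1]

OVERALL_RELIABILITY = 0.85  # assumed static figure; all the "fixed" group sees

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def assess_icing(sensor_score: float) -> Advice:
    """Map a raw detector output to advice plus a per-case confidence.

    sensor_score is a stand-in for the aid's internal evidence (e.g., a
    neural net logit); its calibrated probability doubles as the dynamic
    confidence that would be displayed to the pilot.
    """
    p_icing = sigmoid(sensor_score)
    detected = p_icing >= 0.5
    confidence = p_icing if detected else 1.0 - p_icing
    return Advice(icing_detected=detected, confidence=confidence)

if __name__ == "__main__":
    for score in (2.4, 0.3, -1.8):
        advice = assess_icing(score)
        # The "updated" group sees the per-case confidence; the "fixed"
        # group would see only OVERALL_RELIABILITY for every encounter.
        print(f"icing={advice.icing_detected}  "
              f"per-case confidence={advice.confidence:.2f}  "
              f"(static reliability figure: {OVERALL_RELIABILITY:.2f})")
```

The design point the example is meant to capture: with only the static figure, every piece of advice looks equally trustworthy, whereas a dynamic confidence score lets the operator weight each individual recommendation, which is the case-by-case judgment the Background identifies as impossible under reliability-only displays.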
