Continuous Calibration of Trust in Automated Systems

Abstract: This report details three studies conducted to explore how users calibrate their trust in automation. In the first study, we found that all-or-none thinking about automation reliability was associated with severe decreases in trust following an aid error, whereas high expectations for automation performance were not. In the second study, we examined predictors and outcomes of trust calibration, measuring calibration in three different ways. Awareness of the aid's accuracy trajectory (whether it was becoming more or less reliable over time) was a significant predictor of calibration; however, none of the three calibration measures was strongly associated with task performance or with the ability to identify aid errors. We also describe the conceptual premise and design of our third and final study, which examines the development, loss, and recovery of trust in a route-planning aid in a military simulation context. The results of this study will be presented in our final report.
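The report does not specify how the three calibration measures were computed. Purely as an illustrative sketch, one common operationalization in this literature treats calibration as the correspondence between a user's stated trust and the aid's actual reliability over time; the function and data below are hypothetical, not the authors' method.

    import numpy as np
    from scipy.stats import pearsonr

    def trust_calibration(trust_ratings, aid_reliability):
        # Correlation between block-by-block trust ratings and the aid's
        # actual reliability in those blocks. Values near +1 suggest
        # well-calibrated trust; values near 0 or negative suggest
        # poorly calibrated trust.
        return pearsonr(trust_ratings, aid_reliability)

    # Hypothetical data: trust rated 1-7 after each of six task blocks,
    # alongside the aid's true hit rate in each block.
    trust = np.array([6, 6, 5, 4, 3, 3])
    reliability = np.array([0.95, 0.90, 0.80, 0.70, 0.60, 0.55])

    r, p = trust_calibration(trust, reliability)
    print(f"calibration r = {r:.2f} (p = {p:.3f})")

In this made-up example the user's trust tracks the aid's declining reliability closely, so the correlation is high, indicating good calibration under this (assumed) measure.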
