Automation bias and verification complexity: a systematic review

Introduction: While potentially reducing decision errors, decision support systems can introduce new types of errors. Automation bias (AB) occurs when users become over-reliant on decision support, which reduces vigilance in information seeking and processing. Most research originates from the human factors literature, where the prevailing view is that AB occurs only in multitasking environments.

Objectives: This review compares the human factors and health care literatures, focusing on the apparent association of AB with multitasking and task complexity.

Data sources: EMBASE, Medline, Compendex, Inspec, IEEE Xplore, Scopus, Web of Science, PsycINFO, and Business Source Premier, 1983 to 2015.

Study selection: Evaluation studies in which task execution was assisted by automation and resulted in errors were included. Participants needed to be able to verify automation correctness and to perform the task manually.

Methods: Tasks were identified and grouped. Task type, automation type, and the presence of multitasking were noted, and each task was rated for its verification complexity.

Results: Of 890 papers identified, 40 met the inclusion criteria; 6 were in health care. Contrary to the prevailing human factors view, AB was found in single tasks, typically involving diagnosis rather than monitoring, and with high verification complexity.

Limitations: The literature is fragmented, with large discrepancies in how AB is reported. Few studies reported the statistical significance of AB compared with a control condition.

Conclusion: AB appears to be associated with the degree of cognitive load experienced in decision tasks and does not appear to be uniquely associated with multitasking. Strategies to minimize AB might focus on reducing cognitive load.
