Using Modeling and Simulation to Predict Operator Performance and Automation-Induced Complacency With Robotic Automation

Objective: The aim of this study was to develop and validate a computational model of the automation complacency effect as operators perform a robotic arm task supported by three different degrees of automation. Background: Some computational models of complacency in human–automation interaction exist, but these were formulated and validated in the context of fairly simple monitoring failures. This research extends model validation to a much more complex task, so that system designers can establish, without the need for human-in-the-loop (HITL) experimentation, the merits and shortcomings of different degrees of automation. Method: We developed a realistic simulation of a space-based robotic arm task that could be carried out with three different levels of trajectory visualization and execution automation support. Using this simulation, we performed HITL testing. Complacency was induced via several trials of correctly functioning automation and was then assessed on trials in which the automation failed. Following a cognitive task analysis of the robotic arm operation, we developed a multicomponent model of the robotic operator and his or her reliance on automation, based in part on visual scanning. Results: Comparison of model predictions with empirical results revealed that the model accurately predicted routine performance and predicted operator responses to automation failures after complacency had developed. However, the scanning model did not account for the full attention-allocation effects of complacency. Applications: Complacency modeling can provide a useful tool for predicting the effects of different types of imperfect automation. The results of this research suggest that automation development should focus on supporting situation awareness.

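To make the scanning component of such an operator model concrete, the sketch below illustrates a SEEV-style (salience, effort, expectancy, value) allocation of visual attention across areas of interest (AOIs). This is a minimal, hypothetical sketch, not the study's actual multicomponent model: the AOI names, coefficient weights, and parameter values are illustrative assumptions, and complacency is approximated simply as reduced expectancy for the automation-status AOI after reliable automation.

```python
import random

# Illustrative SEEV-style attention allocation sketch (hypothetical values,
# not the parameters used in the study). Each area of interest (AOI) receives
# an attentional weight from salience, effort, expectancy, and value; scan
# probabilities are the normalized weights.

AOIS = {
    # name: (salience, effort, expectancy, value) -- assumed, illustrative values
    "camera_view":       (0.6, 0.2, 0.8, 0.9),
    "trajectory_panel":  (0.4, 0.3, 0.5, 0.7),
    "automation_status": (0.3, 0.4, 0.2, 0.8),  # low expectancy after reliable automation
}

def seev_weight(salience, effort, expectancy, value,
                ws=1.0, we=1.0, wx=1.0, wv=1.0):
    """Weighted SEEV sum; effort enters as a cost and is subtracted."""
    return max(ws * salience - we * effort + wx * expectancy + wv * value, 0.0)

def scan_probabilities(aois):
    """Normalize SEEV weights into steady-state probabilities of fixating each AOI."""
    weights = {name: seev_weight(*params) for name, params in aois.items()}
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

def simulate_scan(aois, n_fixations=20, seed=1):
    """Sample a fixation sequence from the steady-state scan probabilities."""
    rng = random.Random(seed)
    probs = scan_probabilities(aois)
    names, p = zip(*probs.items())
    return [rng.choices(names, weights=p, k=1)[0] for _ in range(n_fixations)]

if __name__ == "__main__":
    print(scan_probabilities(AOIS))
    # Complacency can be approximated by lowering the expectancy of the
    # automation-status AOI after a run of failure-free trials, which lowers
    # its scan rate and delays detection of an automation failure.
    print(simulate_scan(AOIS))
```

Under these assumptions, the neglect of the automation-status AOI emerges directly from its low expectancy term; richer models would also capture the post-failure recovery of scanning, which the abstract notes simple scanning models do not fully explain.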