Stages and Levels of Automation in Support of Space Teleoperations

Objective: This study examined the impact of the stage of automation on performance and perceived workload during simulated robotic arm control tasks in routine and off-nominal scenarios. Background: Automation varies with respect to the stage of information processing it supports and its assigned level of automation. Making appropriate choices in terms of stages and levels of automation is critical to ensuring robust joint system performance. To date, this issue has been studied empirically in domains such as aviation and medicine but not extensively in the context of space operations. Method: A total of 36 participants played the role of a payload specialist and controlled a simulated robotic arm. Participants performed fly-to tasks with two types of automation (camera recommendation and trajectory control automation) of varying stage. Tasks were performed in routine scenarios and in scenarios in which either the trajectory control automation or a hazard avoidance automation failed. Results: Increasing the stage of automation progressively improved performance and lowered workload when the automation was reliable but incurred severe performance costs when the system failed. Conclusion: The results of this study support concerns about automation-induced complacency and automation bias when later stages of automation are introduced. The benefits of such automation are offset by the risk of catastrophic outcomes when system failures go unnoticed or are difficult to recover from. Application: A medium stage of automation appears preferable: it provides sufficient support during routine operations while helping to avoid potentially catastrophic outcomes when the automation fails.
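
To make the stages-and-levels framework referenced in the Background more concrete, the following minimal Python sketch encodes the four information-processing stages of the Parasuraman, Sheridan, and Wickens taxonomy and shows one plausible way the two automation types examined in this study could be characterized within it. The specific stage assignments and level values below are illustrative assumptions, not parameters reported by the study.

```python
from dataclasses import dataclass
from enum import IntEnum


class Stage(IntEnum):
    """Four stages of information processing that automation can support
    (Parasuraman, Sheridan, & Wickens taxonomy)."""
    INFORMATION_ACQUISITION = 1
    INFORMATION_ANALYSIS = 2
    DECISION_SELECTION = 3
    ACTION_IMPLEMENTATION = 4


@dataclass
class AutomationDesign:
    """An automation function described by the latest stage it supports and
    its level of autonomy (1 = fully manual ... 10 = fully autonomous)."""
    name: str
    stage: Stage
    level: int  # Sheridan-Verplank-style 1-10 scale


# Illustrative (assumed) characterization of the two automation types
# used in the simulated robotic arm task.
camera_recommendation = AutomationDesign(
    name="camera recommendation",
    stage=Stage.DECISION_SELECTION,     # suggests a camera view; operator confirms
    level=4,
)
trajectory_control = AutomationDesign(
    name="trajectory control",
    stage=Stage.ACTION_IMPLEMENTATION,  # executes the fly-to trajectory itself
    level=7,
)

for design in (camera_recommendation, trajectory_control):
    print(f"{design.name}: stage={design.stage.name}, level={design.level}")
```

Under this framing, "increasing the stage of automation" corresponds to moving support from earlier stages (acquisition, analysis) toward decision selection and action implementation, which is where the study observed both the largest routine-performance benefits and the largest failure-related costs.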
