Absence of DOA Effect but No Proper Test of the Lumberjack Effect: A Reply to Jamieson and Skraaning (2019)

Objective: The aim was to evaluate the relevance of the critique offered by Jamieson and Skraaning (2019) regarding the applicability of the lumberjack effect of human–automation interaction to complex real-world settings.

Background: The lumberjack effect, established through meta-analysis, identifies the consequences of a higher degree of automation: when the automation functions as intended, performance improves and workload is reduced; when the automation fails, performance degrades more severely, mediated by a loss of situation awareness (SA). Jamieson and Skraaning present data from a process control scenario that they assert contradicts the effect.

Approach: We analyzed key aspects of their simulation, measures, and results that, we argue, limit the strength of their conclusion that the lumberjack effect does not apply to complex real-world systems.

Results: Our analysis identified an inappropriate choice of automation, the lack of a routine performance measure, subjective operator measures that in fact support the lumberjack effect, an inappropriate assessment of SA, and a possible limitation of statistical power.

Conclusion: We regard these limitations as reasons to temper the authors' strong conclusion that the lumberjack effect does not apply to complex environments. Their findings should instead serve as an impetus for further research on human–automation interaction in these domains.

Applications: The collective findings of both Jamieson and Skraaning and our analysis are applicable to system designers and users in deciding upon the appropriate level of automation to deploy.
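One of the limitations noted above is a possible shortfall of statistical power. As a purely illustrative sketch, not drawn from either paper's data, the following Python snippet shows how the power of a between-groups comparison can be gauged for a small operator sample; the effect size, per-group sample size, and alpha below are assumed values chosen only for illustration.

```python
# Illustrative power check for a two-sample comparison (hypothetical values,
# not taken from Jamieson and Skraaning's study or from this reply).
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Assumed inputs: a medium-to-large effect (Cohen's d = 0.7), 12 operators
# per condition, and a conventional two-sided alpha of .05.
power = analysis.power(effect_size=0.7, nobs1=12, alpha=0.05,
                       ratio=1.0, alternative='two-sided')
print(f"Estimated power: {power:.2f}")  # roughly 0.37 under these assumptions

# Per-group sample size needed to reach 80% power at the same effect size.
n_needed = analysis.solve_power(effect_size=0.7, power=0.80, alpha=0.05,
                                ratio=1.0, alternative='two-sided')
print(f"Per-group n for 80% power: {n_needed:.1f}")  # roughly 33-34
```

Under these assumed values, a null result is compatible with a true effect simply going undetected, which is the sense in which limited power tempers a "no effect" conclusion.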

[1] Thomas B. Sheridan, et al. Human and Computer Control of Undersea Teleoperators, 1978.

[2] Dietrich Manzey, et al. Automation in Surgery: The Impact of Navigated-Control Assistance on the Performance, Workload and Situation Awareness of Surgeons, 2010.

[3] Christopher D. Wickens, et al. A Model for Types and Levels of Human Interaction With Automation, 2000, IEEE Trans. Syst. Man Cybern. Part A.

[4] Greg A. Jamieson, et al. Levels of Automation in Human Factors Models for Automation Design: Why We Might Consider Throwing the Baby Out With the Bathwater, 2018.

[5] Mica R. Endsley, et al. The Out-of-the-Loop Performance Problem and Level of Control in Automation, 1995, Hum. Factors.

[6] David B. Kaber, et al. Issues in Human–Automation Interaction Modeling: Presumptive Aspects of Frameworks of Types and Levels of Automation, 2018.

[7] Mica R. Endsley, et al. Direct Measurement of Situation Awareness: Validity and Use of SAGAT, 2000.

[8] Heath A. Ruff, et al. Effect of Level of Automation on Unmanned Aerial Vehicle Routing Task, 2009.

[9] Huiyang Li, et al. Human Performance Consequences of Stages and Levels of Automation, 2014, Hum. Factors.

[10] Greg A. Jamieson, et al. The Absence of Degree of Automation Trade-Offs in Complex Work Settings, 2020, Hum. Factors.

[11] Christopher D. Wickens, et al. Complacency and Automation Bias in the Use of Imperfect Automation, 2015, Hum. Factors.

[12] Christopher D. Wickens, et al. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human–Automation Interaction, 2017, Hum. Factors.

[13] Mica R. Endsley, et al. Measurement of Situation Awareness in Dynamic Systems, 1995, Hum. Factors.