Assessing and Treating Risks in Mechanised NDT: A Human Factors Study

The reliability of NDT is affected by human factors, which have so far received the least attention in reliability assessments. With the increased use of automation in the form of mechanised testing (automation-assisted inspection and the corresponding evaluation of data), higher reliability standards are believed to have been achieved. However, human inspectors, and thus human factors, still play an important role throughout this process, and the risks involved in this application are unknown. The aim of this study was to explore, for the first time, the risks associated with mechanised NDT and to find ways of mitigating their effects on inspection performance. The objectives were therefore to identify and analyse potential risks in mechanised NDT and to devise measures against them. To address these objectives, a risk assessment in the form of a Failure Modes and Effects Analysis (FMEA) was conducted. The analysis revealed potential for failure during both the acquisition and the evaluation of NDT data that could be attributed to the human, the technology, and the organisation. Since the existing preventive measures were judged insufficient to defend the system against the identified failures, new preventive measures were suggested.
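For readers unfamiliar with how an FMEA prioritises risks, the minimal Python sketch below illustrates the generic Risk Priority Number (RPN) arithmetic, RPN = severity x occurrence x detection, applied to a few hypothetical mechanised-NDT failure modes. The failure modes, rating values, and helper class are illustrative assumptions for this sketch, not the worksheet, scales, or prioritisation rule used in the study itself.

from dataclasses import dataclass

@dataclass
class FailureMode:
    """One row of a generic FMEA worksheet (illustrative, not the study's worksheet)."""
    description: str
    severity: int      # 1 (negligible effect) .. 10 (catastrophic effect)
    occurrence: int    # 1 (rare) .. 10 (frequent)
    detection: int     # 1 (almost certain to be detected) .. 10 (practically undetectable)

    @property
    def rpn(self) -> int:
        """Risk Priority Number = severity * occurrence * detection."""
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for a mechanised NDT workflow (assumed examples only)
modes = [
    FailureMode("Probe coupling lost during mechanised scan (data acquisition)", 8, 4, 5),
    FailureMode("Inspector over-relies on automated defect flags (data evaluation)", 7, 5, 6),
    FailureMode("Scan plan not updated after component change (organisation)", 6, 3, 7),
]

# Rank failure modes by RPN so the highest-priority risks are treated first
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {m.rpn:4d}  {m.description}")

Sorting by RPN is one common prioritisation rule; FMEA variants also flag high-severity modes for treatment regardless of their RPN.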
