Modeling patterns of breakdown (or archetypes) of human and organizational processes in accidents using system dynamics

Systems approaches to safety have received growing attention in modern accident investigation techniques (e.g., STAMP, Accimap), with the emphasis shifting to the organizational dynamics (or archetypes) that can erode defenses and cause a drift outside the safety margins. Although the literature contains many applications of archetypes and system dynamics to safety, this richness comes at a cost: it has become difficult for safety practitioners to integrate the diverse system-dynamics studies and their diverging models. To provide a practical system-dynamics tool for accident investigation, this article reviews earlier studies and integrates them into a classification of patterns of breakdown (or archetypes) of both human and organizational processes, on the basis of two control models: the Extended Control Model (ECOM) and the Viable System Model (VSM). Archetypes are represented as variants of two generic templates of performance that exploit many elements of complexity theory and system control. Apart from giving safety practitioners practical access to the literature on archetypes, the generic ECOM and VSM templates can also be used to build simulators of individual and organizational processes for risk analysis.
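The kind of "drift outside the safety margins" described above can be sketched as a tiny system-dynamics simulation. The following is a minimal illustration, not the article's model: all variable names and parameter values (`pressure`, `adjust`, `incident_level`, `recovery`) are hypothetical, chosen only to show how an "eroding goals" archetype produces a gradual decline of defenses punctuated by incident-driven recoveries.

```python
# Hypothetical "eroding goals" safety archetype, simulated with simple
# Euler integration. Production pressure slowly lowers the perceived
# acceptable safety target; the actual safety margin drifts toward that
# target; when the margin falls below an incident threshold, a safety
# campaign resets the target upward and the cycle repeats.

def simulate(steps=200, dt=1.0,
             margin=1.0,          # safety margin stock (1.0 = design level)
             target=1.0,          # perceived acceptable margin (the "goal")
             pressure=0.004,      # rate at which pressure erodes the goal
             adjust=0.05,         # rate at which margin drifts toward the goal
             incident_level=0.4,  # margin below which an incident occurs
             recovery=0.8):       # goal reset after an incident
    history = []
    for _ in range(steps):
        target = max(0.0, target - pressure * dt)   # goal erosion
        margin += adjust * (target - margin) * dt   # drift toward the goal
        if margin < incident_level:                 # breakdown event
            target = max(target, recovery)          # campaign resets the goal
        history.append(margin)
    return history

hist = simulate()
print(f"final margin: {hist[-1]:.2f}, minimum: {min(hist):.2f}")
```

Plotting `hist` shows the characteristic sawtooth of drift-and-recovery: the margin declines as the goal erodes, drops below the incident threshold, and then partially recovers once the goal is reset.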
