Learning from organizational incidents: Resilience engineering for high‐risk process environments

For years, safety improvements have been made by evaluating incident reports and analyzing errors and violations. Current developments in safety science, however, challenge the idea that safety can meaningfully be seen as the absence of errors or other negatives. Instead, the question becomes whether a company is aware of the positive ways in which people, at all levels of the organization, contribute to the management and containment of the risks it actually faces. A further question is whether the organization has the adaptive capacity to respond to the changing nature of risk as operations shift and evolve. This article presents the results of a resilience engineering safety audit conducted at a chemical company site. An interdisciplinary team of seven researchers carried out four days of field studies and interviews in several plants on the site. The company enjoyed an almost incident-free recent history but turned out to be ill-equipped to handle future risks and many well-known daily problems. Safety was often borrowed from to meet acute production goals, and organizational learning from incidents was fragmented across small organizational or production units, with no company-wide learning. We conclude that improving safety performance hinges on an organization's dynamic capacity to reflect on and modify its models of risk, for example as embodied in safety procedures and policies, as operations and insight into them evolve. © 2008 American Institute of Chemical Engineers Process Saf Prog 2009
