Modeling Safety-II based on unexpected reactor trips

Abstract: Safety-I is defined as a state in which as few things as possible go wrong. To date, the methods for analyzing safety in nuclear power plants (NPPs), i.e., Probabilistic Safety Assessment and Deterministic Safety Analysis, have been developed from the perspective of Safety-I. However, focusing solely on Safety-I may miss opportunities to 1) learn from successes and 2) observe how human adaptation contributes to successful outcomes despite novel circumstances and resource limitations. In this light, a paradigm shift has been suggested, from ensuring that “as few things as possible go wrong” (Safety-I) to ensuring that “as many things as possible go right” (Safety-II). This study aimed to develop a Safety-II model for unexpected situations in NPPs. First, a qualitative Safety-II model was proposed and characterized by modifying a resilience model developed by the Electricite de France (EDF). Second, event reports on unplanned reactor trips at Korean NPPs were analyzed in terms of the elements of this characterized Safety-II model as well as event severity. Third, a quantitative network model of Safety-II was developed by performing a correlation analysis. Finally, a feasibility analysis of the Safety-I and Safety-II concepts for explaining event severity was performed. The results suggest a new methodology for the safety assessment of unexpected reactor trips in NPPs, which could complement conventional probabilistic safety assessments and deterministic safety analyses.
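To make the network-building step concrete, the following is a minimal sketch, not the authors' implementation, of how coded event-report data could be turned into a correlation-based network. The element names ("anticipation", "monitoring", "responding", "learning"), the scores, the use of Pearson correlation, and the edge threshold are all illustrative assumptions; the paper's actual coding scheme and statistics may differ.

```python
# Minimal sketch (assumptions, not the paper's method): build a correlation-based
# network from event reports coded against hypothetical Safety-II elements.
import pandas as pd

# Hypothetical coding: each row is one unplanned-reactor-trip event report, scored
# (e.g., 0-3) on illustrative Safety-II elements plus an event-severity rating.
events = pd.DataFrame(
    {
        "anticipation": [2, 3, 1, 3, 2, 1, 3, 2],
        "monitoring":   [3, 3, 2, 2, 1, 1, 3, 2],
        "responding":   [2, 3, 1, 3, 2, 2, 3, 1],
        "learning":     [1, 2, 1, 3, 2, 1, 2, 2],
        "severity":     [2, 1, 3, 1, 2, 3, 1, 2],
    }
)

# Pairwise Pearson correlations between all coded variables.
corr = events.corr(method="pearson")

# Keep only links whose |r| exceeds a chosen threshold; these become the edges
# of the quantitative network model (threshold value is an assumption).
THRESHOLD = 0.5
edges = [
    (a, b, round(corr.loc[a, b], 2))
    for i, a in enumerate(corr.columns)
    for b in corr.columns[i + 1:]
    if abs(corr.loc[a, b]) >= THRESHOLD
]

for a, b, r in edges:
    print(f"{a} -- {b}: r = {r}")
```

In such a sketch, each retained edge links two model elements (or an element and event severity) whose coded values co-vary strongly across the analyzed trip reports, which is one plausible way to quantify the Safety-II network described in the abstract.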
