Loss prevention at the startup stage in process safety management: From distributed cognition perspective with an accident case study

Abstract Organizational artifacts such as rules, procedures, and codes of practice play important roles in chemical process safety management, and violating them can lead to major accidents. It is therefore essential to understand what such violations imply for the coordinating function of organizational artifacts. Chemical products are currently changing and developing rapidly, which requires chemical plants to develop new processes. The pre-startup stage of a chemical process is especially prone to error because work processes are ill defined and a wide variety of jobs are involved at this stage, yet human error and its underlying causes at this stage have not received the attention they deserve. This paper focuses on human error at the pre-startup stage: (1) major industrial accidents in Korea were analyzed; (2) a survey of relevant personnel in chemical plants was carried out to understand how safety management addresses human error; and (3) an accident that occurred at the pre-startup stage in a Korean chemical plant was studied using the Fault Tree Analysis method to reveal failures in the coordination of organizational artifacts. Distributed cognition theory was adopted to interpret these failures because it is well suited to examining the relationship between human operators and artifacts in a system. This study highlights the importance of considering organizational artifacts related to human error in safety management, and the results may help those involved in pre-startup work at chemical plants improve its safety.
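As a minimal, hypothetical illustration of the Fault Tree Analysis method mentioned above (not taken from the paper's case study), the sketch below computes a top-event probability from basic-event probabilities through AND/OR gates; the event names and probability values are assumptions for illustration only.

```python
# Minimal fault tree sketch: top-event probability from independent basic events.
# Event names and probabilities are hypothetical, for illustration only.

def p_and(*probs):
    """AND gate: all input events must occur (independence assumed)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(*probs):
    """OR gate: at least one input event occurs (independence assumed)."""
    result = 1.0
    for p in probs:
        result *= (1.0 - p)
    return 1.0 - result

# Hypothetical basic events at a pre-startup stage.
p_procedure_outdated = 0.05   # written procedure not updated for the new process
p_handover_missed    = 0.10   # shift handover omits a key valve line-up
p_indicator_failed   = 0.02   # level indicator gives a wrong reading

# Intermediate event: operators lack a correct picture of plant state
# if the handover is missed OR the indicator fails.
p_wrong_state_picture = p_or(p_handover_missed, p_indicator_failed)

# Top event: startup proceeds on a wrong basis only if the procedure is
# outdated AND the operators' picture of the plant state is wrong.
p_top = p_and(p_procedure_outdated, p_wrong_state_picture)

print(f"P(wrong state picture) = {p_wrong_state_picture:.4f}")
print(f"P(top event)           = {p_top:.4f}")
```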
