An algorithm for classifying error types of front-line workers based on the SRK framework

Abstract

Although many classifications of error types exist, the literature provides little guidance on how to systematically assign an event to the proposed error-type categories. This research introduces an algorithm for classifying the error types of front-line workers involved in occupational incidents, based on the skill–rule–knowledge (SRK) framework. The original version of the algorithm was tested at a heavy machinery manufacturer (study 1), in which 36 accidents were analyzed; an improved version was tested at an oil distribution company (study 2), in which the analysis encompassed 20 accidents and 14 near misses. The resulting distributions of error types in studies 1 and 2, respectively, were: slips (42% and 12.2%); memory lapses (0% and 2.4%); violations (17% and 7.3%); knowledge-based errors (11% and 0%); and no worker error (30% and 78.1%). The incident causes attributed by the two companies' safety staff were re-classified using a uniform terminology and then associated with the sub-systems of a socio-technical system. The results of this analysis for studies 1 and 2, respectively, were: technological sub-system (37.3% and 42.8% of all causes); work design sub-system (31.4% and 38.1%); personnel sub-system (31.4% and 19.0%). No cause was associated with the external environment sub-system, probably because it was ignored in the original investigations conducted by the companies' safety staff. In study 2, an analysis was also carried out to track the pathways followed by the investigators through the algorithm: in 63.3% of the investigations, the analysts had to answer 5 of the algorithm's 10 questions.

Relevance to industry

This paper introduces a tool that helps to elucidate the nature of front-line workers' involvement in occupational incidents. In particular, the tool allows the classification of the error types involved in each incident, which is important because different error types require different safety management strategies. Applying the tool typically takes a few minutes, and investigators must answer no more than seven questions per incident. Two case studies, one at a heavy machinery manufacturer and the other at an oil distribution plant, illustrate how the tool should be applied and how the results should be analyzed.
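The algorithm described above works as a sequence of yes/no questions that routes an incident to one SRK-based error type (or to "no worker error"). A minimal sketch of that decision-tree structure is shown below; note that the questions used here are illustrative placeholders only, since the paper's actual ten questions are not reproduced in this abstract, and the routing logic is an assumption based on common SRK-style taxonomies (slips and lapses as execution failures, violations and knowledge-based errors as planning failures).

```python
# Hypothetical sketch of an SRK-style error-type classification tree.
# The question wording and branching below are ILLUSTRATIVE ASSUMPTIONS,
# not the questions defined in the paper.
from dataclasses import dataclass
from typing import Callable, Tuple, Union

Node = Union["Question", str]  # leaves are error-type labels


@dataclass
class Question:
    text: str
    if_yes: Node
    if_no: Node


def classify(root: Question, answer: Callable[[str], bool]) -> Tuple[str, int]:
    """Walk the tree using `answer` (question text -> bool).

    Returns the error-type label reached and the number of
    questions the investigator had to answer to reach it.
    """
    node: Node = root
    asked = 0
    while isinstance(node, Question):
        asked += 1
        node = node.if_yes if answer(node.text) else node.if_no
    return node, asked


# Placeholder tree (four questions instead of the paper's ten):
q_exec_fail = Question("Was a step forgotten or omitted?",
                       "memory lapse", "slip")
q_plan = Question("Was an applicable rule deliberately not followed?",
                  "violation", "knowledge-based error")
q_action = Question("Was the action executed as planned?",
                    q_plan, q_exec_fail)
ROOT = Question("Did a front-line worker's action contribute to the incident?",
                q_action, "no worker error")
```

Classifying an incident then means answering questions until a leaf is reached, e.g.:

```python
answers = {
    "Did a front-line worker's action contribute to the incident?": True,
    "Was the action executed as planned?": False,
    "Was a step forgotten or omitted?": False,
}
label, n_asked = classify(ROOT, lambda q: answers[q])
# label == "slip", reached after answering 3 questions
```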