Human factors in accidents

At first sight, an article on human error based largely on military aviation accidents may appear to be inappropriate material for this journal, particularly when it is written by one whose total knowledge of anaesthesia has been confined to two sessions in a dentist’s chair; but a moment’s reflection may show that errors in the air and errors in the operating theatre have much in common. Both pilots and doctors are carefully selected, highly trained professionals who are usually determined to maintain high standards, both externally and internally imposed, whilst performing difficult tasks in life-threatening environments. Both use high-technology equipment and function as key members of a team of specialists, although not always with colleagues of their choosing, and both are sometimes forced to operate at a time and under conditions which are far from ideal. Finally, both exercise high-level cognitive skills in a most complex domain about which much is known, but where much remains to be discovered; aeronautics, medicine, meteorology, pharmacology and allied fields continue to be very active research areas.

Both pilots and doctors make many errors, at least by the strictest criterion of error as “performance which deviates from the ideal”. The vast majority of these errors, however, are either trivial or easily rectified: an approach speed a knot or so too fast, or a poorly worded communication, will probably dent only professional pride. Indeed, for all honest people each day contains a plethora of trivial errors, such as forgetting to fill the kettle, stopping at a green light, or failing to notice the duplication of a word in a sentence.

Usually there is sufficient slack in the system for the error to be ignored, or to be noticed and corrected. Some apparently innocuous errors, however, go unnoticed, and some systems are less forgiving than others. A high performance aircraft or a nuclear power plant, for example, functions through a host of complex interactions and is what engineers describe as “tightly coupled” (Perrow, 1984): what happens in one part of the system directly, and often very quickly, affects other parts. Recovery from a control error made at high speed and low level may therefore be impossible, whereas the same error in the cruise might barely occasion comment. For both pilot and doctor, then, one of their frequent errors may, very occasionally, lead to a catastrophe or, in the often quoted words of Cherns, “An accident is an error with sad consequences” (Cherns, 1962).
