INTERVENTION STRATEGIES FOR THE MANAGEMENT OF HUMAN ERROR

This report examines the management of human error in the cockpit. The principles likely apply to other areas of aviation (e.g., air traffic control, dispatch, weather services) and to high-risk systems outside aviation (e.g., shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention: it is a more encompassing term that includes not only preventing errors but also keeping an error, once made, from adversely affecting system output. Such techniques include traditional human factors engineering; improved feedback and feedforward of information from system to crew; 'error-evident' displays that make erroneous input more obvious to the crew; trapping of errors within a system; goal sharing between humans and machines (also called 'intent-driven' systems); paperwork management; and behaviorally based approaches, including procedures, standardization, checklist design, training, and cockpit resource management. Fifteen guidelines for the design and implementation of intervention strategies are included.
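The notions of error trapping and error-evident display mentioned above can be sketched in code: an entry is checked against system limits before it can affect output, and a rejected entry is echoed back verbatim so the error is obvious to the crew. This is only a minimal hypothetical illustration; the function name, altitude limits, and message formats are assumptions, not drawn from the report.

```python
def validate_altitude_entry(raw: str,
                            min_ft: int = 0,
                            max_ft: int = 45000) -> tuple[bool, str]:
    """Trap an erroneous altitude entry before it propagates.

    Returns (accepted, message). A rejected entry is echoed back
    verbatim in the message, making the error evident to the crew.
    Limits are illustrative assumptions, not real aircraft limits.
    """
    try:
        value = int(raw)
    except ValueError:
        # Non-numeric input is trapped at the point of entry.
        return False, f"REJECTED: '{raw}' is not a number"
    if not (min_ft <= value <= max_ft):
        # Out-of-range input is trapped and the offending value shown.
        return False, f"REJECTED: {value} ft outside {min_ft}-{max_ft} ft"
    return True, f"ACCEPTED: {value} ft"
```

For example, `validate_altitude_entry("350O0")` (a keystroke slip) is rejected with the erroneous string displayed, whereas `validate_altitude_entry("35000")` is accepted: the error is caught within the system rather than prevented at its source.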