Why the pilot cannot be blamed: a cautionary note about excessive reliance on technology

In many human endeavours, intelligent automation has taken over some of the tasks traditionally performed by operators, pilots, or controllers. The adoption of intelligent protective technology reflects the greater degree of reliability normally ascribed to such systems, and intelligent technology is often credited with saving lives and reducing accidents. This paper examines the crash of a revolutionary supersonic fighter that resulted from over-reliance on protection technology. The degree of automation in the protection system made it impossible for the pilot to regain control or to convince the system that there was a problem. Technology has thus created a new kind of computer-assisted error, in which a system designed to make a task safer becomes directly responsible for causing a disaster. Developers therefore need to foresee the impact of new technology in its operational context and consider the implications of wresting control away from the pilot and giving it to the computer.
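To make the failure mode concrete, the sketch below is a minimal, hypothetical illustration, not drawn from the accident aircraft or any real flight-control software; the names, threshold, and logic are illustrative assumptions only. It shows a protection loop that grants the automation unconditional authority: when its own (possibly faulty) estimate exceeds a limit, the pilot's input never reaches the control surfaces, which is precisely the condition the paper warns against.

    # Hypothetical sketch (assumed names and values, not the paper's system):
    # a protection law that always overrides pilot input above an estimated limit.

    from dataclasses import dataclass

    @dataclass
    class FlightState:
        angle_of_attack: float      # degrees, as estimated by the system's own sensors
        pilot_pitch_command: float  # -1.0 (full nose down) .. +1.0 (full nose up)

    def protection_command(state: FlightState, aoa_limit: float = 25.0) -> float:
        """Return the pitch command actually sent to the control surfaces.

        The protection logic overrides the pilot whenever its own estimate of
        angle of attack exceeds the limit; there is no channel through which
        the pilot can signal that the estimate itself is wrong.
        """
        if state.angle_of_attack > aoa_limit:
            return -1.0  # automatic nose-down correction; pilot input is ignored
        return state.pilot_pitch_command

    # If the sensed estimate is faulty, a correct recovery input from the pilot
    # is discarded: the automation, not the pilot, determines the outcome.
    faulty = FlightState(angle_of_attack=40.0, pilot_pitch_command=+0.8)
    print(protection_command(faulty))  # -1.0

The design choice at issue is the absence of any override or arbitration path for the pilot; an alternative architecture would let a deliberate, sustained pilot input take precedence over, or at least be weighed against, the automated protection.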
