Coping with accelerating socio-technical systems

Abstract In 1997 a workshop was held on how to cope with accelerating technologies: industries such as aviation, air traffic control and biotechnology, among others, in which the rate of technological development is significant. Papers were presented from a range of viewpoints and in a variety of industrial contexts, ranging from theoretical models intended to help understand the process of acceleration and its impact on organisational learning, to practical analyses of potential future risks in specific accelerating industries such as air traffic control. The aim of the papers, and of the workshop as a whole, was to provide insight into the problems associated with accelerating technologies, and thereby to derive measures to control or cope with such acceleration. The problems arising from acceleration, as predicted by the theoretical models and evidenced by experience in accelerating industries (e.g. aviation), are manifold. Two particular problems are unforeseen risks in an industry (a lack of forward vision) and a failure to learn adequately, i.e. in time, from incidents occurring in an organisation (a lack of constructive hindsight). There is also an incipient danger of society being driven by technology rather than being led by social needs. Even if each technology ultimately becomes ‘ultra-safe’, it will nevertheless have its own ‘event horizon’ limiting useful further progress. Each of the papers from the workshop is summarised and integrated into a three-part synopsis covering the context of accelerating technologies, the modelling of their impacts, and the derivation of coping strategies. Four of the papers are included in their entirety as separate papers in this special issue of Safety Science.