Learning Disabilities for Regulators

In hazardous industries, regulatory agencies confront a dual mission: enforcing regulations while learning from experience. This research examines how incentives designed for rule enforcement influence the gathering and interpretation of the hazard-related information essential for learning. It explores the high reliability theory argument that strict adherence to standard operating procedures can coexist with organizational learning from mishaps. Drawing on interviews with participants in aviation safety monitoring systems and on archival data, the research finds that incentives for compliance either decreased or increased the availability of hazard-related information, depending on the design of the system. The findings support a political theory of organizational reliability.
