Sizeable and long-lasting reductions in adverse events cannot be realized unless decision makers at all levels pay attention to the global, systemic phenomenon of inadvertent harm to patients.
Many healthcare providers now recognize that preventable medical harm poses a significant risk to public health. The American statistics in particular are frequently cited: preventable medical error is the eighth leading cause of death in the United States, is responsible for 44 000–98 000 deaths annually in hospitals alone, and results in patient injuries that cost between $17 billion and $29 billion each year.1
Virtually all of the medical experts who have written on this topic have stated that the key to improving patient safety is to apply system design principles from human factors engineering.1,2 This discipline aims to tailor the design of technology to conform to human nature rather than expect people to contort and adapt to technology. By doing so, systems become easier for people to work in, ultimately reducing error. Human factors techniques have been applied to other industries, such as nuclear power and aviation, and have been very successful in reducing error and improving safety in these contexts.
If the magnitude of the problem is significant and widely known, and if there is a consensus on the likely remedy, then why has more progress not been made on improving patient safety? One possibility is that human factors engineering has traditionally been concerned primarily with “knobs and dials” or “graphical user …
[1] K. J. Vicente et al., Cognitive Work Analysis: Toward Safe, Productive, and Healthy Computer-Based Work, 1999.
[2] K. J. Vicente et al., "Patient Safety, Potential Adverse Drug Events, and Medical Device Design: A Human Factors Engineering Approach," J. Biomed. Informatics, 2001.
[3] B. A. Liang, "Error in Medicine: Legal Impediments to U.S. Reform," 1999.
[4] L. Kohn et al., To Err Is Human: Building a Safer Health System, 2007.
[5] D. Gaba et al., "Anesthesia crisis resource management training: teaching anesthesiologists to handle critical incidents," Aviation, Space, and Environmental Medicine, 1992.
[6] J. Rasmussen et al., "Risk management in a dynamic society: a modelling problem," 1997.
[7] E. Hollnagel et al., "Cognitive Systems Engineering: New Wine in New Bottles," Int. J. Man-Mach. Stud., 1983.
[8] L. Leape, "Error in Medicine," 1994.
[9] S. Kraman et al., "Risk Management: Extreme Honesty May Be the Best Policy," Annals of Internal Medicine, 1999.