Ambiguity and Workarounds as Contributors to Medical Error

The Quality Grand Rounds series in Annals illustrates how work-system conditions can produce errors and adverse events (1). The human cost of medical error provided the incentive for such studies (2-5). In one case, a nurse mistakenly used insulin rather than heparin to flush the arterial line of a patient, Mrs. Grant, causing severe hypoglycemia, seizures, coma, and, ultimately, death (6). In another, unreliable processes for identifying patients, ensuring consent, and exchanging information led to a Mrs. Morris being mistaken for a Mrs. Morrison; as a result, the patient was subjected to an unnecessary, potentially dangerous electrophysiologic examination (7).

We ask: Do medical errors such as these have common root causes? Can lessons to improve reliability be drawn from organizations outside health care that overcome the potential for catastrophe posed by complex work, knowledge intensiveness, and varied and volatile circumstances (8)? The answer to both questions is yes. Error-prone organizations tolerate ambiguity: a lack of clarity about what is expected to happen as work proceeds. Ambiguity makes it difficult to define what constitutes a problem in several aspects of work. It is not perfectly clear 1) what the work group is trying to achieve; 2) who is responsible for which tasks; 3) how to exchange information, materials, or services; or 4) exactly how to perform tasks. Moreover, even when problems are recognized, they are worked around; people improvise to get the job done, even when indicators suggest something is amiss. They fail to contain problems or improve processes, leaving factors that confounded one person's work to confound again. In contrast, superlative organizations design work as a series of ongoing experiments by consistently specifying how work is to be done. Specification makes clear what is expected: who is to be where, who should be doing what, and what results should occur.
When actual experience deviates from specification, these organizations promptly investigate the deviations to prevent them from causing harm or recurring (Table).

Table. Contrasting Error-Prone and High-Performing Organizations

Contributions of Ambiguity and Workarounds to Medical Errors

Mrs. Grant stabilized after cardiac surgery, allowing reasonable clarity about what additional care she needed (aspect 1); who was responsible for one element of that care, flushing the arterial line (aspect 2); and the fact that her nurse knew he needed to perform that task (aspect 3), as evidenced by his responding to an alarm indicating an occlusion. Although there was clarity about how to flush the line (aspect 4), heparin and insulin were difficult to differentiate: both were stored in vials of similar size, shape, weight, and location, and once in a syringe the two drugs are indistinguishable because both are colorless. This lack of clarity meant that the nurse could not tell whether he had done his job correctly. As for the contribution of workarounds to the tragedy, nurses had probably chosen insulin rather than heparin before but had corrected the error before administration. (Bates [9] estimates that incorrect drug administrations outnumber instances of patient harm by a ratio of 100 to 1.) Swapping the right drug for the wrong one, without reducing the chance of confusing the two again, preserved the potential for recurrence.

Seventeen errors were identified in the Morris/Morrison case. Among them was a miscommunication (aspect 3) between the nurse who was looking for Mrs. Morrison and someone who thought this nurse was seeking Mrs. Morris. Errors in task performance (aspect 4) included a nurse's incorrect report that Mrs. Morrison had been transferred and the laboratory's failure to verify Mrs. Morris's identity. The danger of workarounds is evident in the decision of Mrs.
Morris's care team to continue her transport despite the absence of an order or a signed consent form in her chart, and even over the patient's objections. A resident caring for Mrs. Morris did not intervene when he found the laboratory performing the unexpected procedure; rather, he assumed that the attending physician had not informed him of the study, a failure in communication that had occurred before.

High-Performing Systems: Specification and Immediate Problem Solving

In pursuing quality, safety, productivity, and flexibility, leaders in other industries specify exactly what is expected in the 4 aspects of work described above. In so doing, they create the opportunity to be surprised, allowing workers to recognize deviations from the expectations implied by the original specification. Once surprised, these leaders treat discrepancies as abnormal and investigate them immediately. This approach contains problems, generates knowledge, and leads to improvement. For example, aircraft carriers are dangerous workplaces because of severe weather, limited visibility, rapid changes in mission, and continuous arrivals and departures of aircraft, all competing for the same limited deck space, equipment, and crew. Despite these dangers, flight operations are typically safe. Work is highly specified, even for circumstances in which a change in situation requires a change in roles. Crews color-code uniforms, demarcate spaces on the deck, and define what is to be done during launches and recoveries. Aberrations, such as someone being out of position, quickly make it obvious that operations cannot continue as if all were normal (10). Southwest Airlines is faster and more accurate than its competitors at the critical process of flight departures, despite having to coordinate specialized employees amid the vagaries of weather, airport congestion, mechanical failures, and load fluctuations.
It specifies what must be done to ensure a smooth departure and also to make it evident when the situation has changed (and thereby requires a different, but also specified, plan) (11). Toyota, a leader in the complex work of product design (12, 13), new-model introduction (14), and production (15, 16), specifies how work is to be done so that even small deviations from expectations (whether in routine work or in highly complex, unique efforts such as new-model launches and disaster recovery [17]) are evident. Once detected, problems are promptly investigated (15, 18, 19) and contained, and the information relevant to understanding them is fresh and easier to reconstruct accurately than it would be if problem solving were delayed (20-22).

Examples in Health Care

Some health care organizations have successfully tested highly specified processes. The Shock, Trauma, and Respiratory intensive care unit at LDS Hospital in Salt Lake City, Utah, developed protocols to better control glucose levels, decrease nosocomial infection rates, and reduce costs. These protocols are noteworthy because, once developed, they were often changed as users encountered problems applying them (23). Thompson and colleagues (24) reported on how hospitals reduced ambiguity and workarounds. In one hospital, nurses averaged 23 searches per shift for keys to the narcotics cabinet; this wasted 49 minutes per shift and delayed analgesia to patients. Rather than tolerate continued searches, administrators tested assigning numbered keys at the start of each shift, with safeguards to prevent loss or misuse. This procedure nearly eliminated searches for keys and saved 2895 nurse-hours yearly in a 350-bed hospital. Another hospital's pharmacy used deviations from design to trigger process improvement rather than workarounds. Without any investment in technology, searches for missing medication decreased by 60% and stockouts fell by 85%.
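The reported key-search savings can be sanity-checked with simple arithmetic. The sketch below is a back-of-envelope calculation, not part of the original report: the shift count, days per year, and number of narcotics cabinets are assumptions. It converts the 49 wasted minutes per shift into annual hours at a single cabinet and solves for the number of cabinets the 2895-hour figure would imply.

```python
# Back-of-envelope check of the reported key-search savings.
# Assumptions (not stated in the report): 3 nursing shifts per day,
# a 365-day year, and one narcotics cabinet per nursing unit.
MINUTES_WASTED_PER_SHIFT = 49
SHIFTS_PER_DAY = 3
DAYS_PER_YEAR = 365
REPORTED_SAVINGS_HOURS = 2895  # nurse-hours per year, as reported

# Annual hours wasted at a single cabinet under these assumptions.
hours_per_cabinet = MINUTES_WASTED_PER_SHIFT * SHIFTS_PER_DAY * DAYS_PER_YEAR / 60

# Number of cabinets the reported savings figure would imply.
implied_cabinets = REPORTED_SAVINGS_HOURS / hours_per_cabinet

print(f"~{hours_per_cabinet:.0f} h/yr per cabinet; "
      f"savings imply roughly {implied_cabinets:.1f} cabinets")
```

Under these assumptions, one cabinet accounts for roughly 894 nurse-hours per year, so the reported savings correspond to a hospital with three to four such cabinets.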
Avoiding Ambiguity and Workarounds in the Annals Cases

What difference might similar procedures have made in Mrs. Grant's case? The first time a nurse noticed that the wrong drug had been drawn, an investigation would have been initiated to discover why selecting the wrong item was so easy. Insulin and heparin might then have been stocked in distinctive vials or locations before someone else could err again. In Mrs. Morrison's case, hospital staff would have specified a particular time for Mrs. Morrison's electrophysiologic examination (aspect 1). This would have led to well-specified assignments for who was responsible for transport (aspect 2), the manner in which the electrophysiology laboratory was to request the next patient (aspect 3), and how staff would identify patients and obtain consent (aspect 4). With such clarity about what was supposed to happen, staff would have seen that Mrs. Morrison's situation (being left unprepared for a test and not being transported as expected) was contrary to expectations and would have treated it as a problem. They would then have immediately stopped work and triggered investigation, problem solving, and process improvement.

Conclusions

Meticulous specification of who should supply what goods, information, or services, to whom, in what fashion, and when allows problems to be identified, often before they produce adverse events. With such a system, the consequences of problems do not propagate, and investigations result in design changes that reduce the likelihood of recurrence. But how does one start, given that health care seems uniquely complex? Every patient presents unique features, diagnostic and therapeutic methods change quickly, the consequences of error can be profound, and the needs of several patients often must be met concurrently. Start small. There is no need to specify an entire system at once.
As with the examples from Thompson and colleagues and from LDS Hospital, small pieces of larger systems can be specified. As problems reveal themselves, other items needing specification become evident. At the same time, the process teaches important lessons in applying and internalizing these principles. Start simple. Much of what patients require, and the fact that meeting these needs sometimes results in error,

References

[1] P. Adler, et al. Flexibility versus efficiency? A case study of model changeovers in the Toyota production system, 1999.

[2] S. Spear, et al. Decoding the DNA of the Toyota Production System, 1999.

[3] T. van der Schaaf. Medical applications of industrial safety science, 2002, Quality & Safety in Health Care.

[4] Steven J. Spear, et al. The essence of just-in-time: Embedding diagnostic tests in work-systems to achieve operational excellence, 2002.

[5] J. Burke, et al. Infection control - a problem for patient safety, 2003, The New England Journal of Medicine.

[6] K. Weick, et al. Collective mind in organizations: Heedful interrelating on flight decks, 1993.

[7] Ramachandran Jaikumar, et al. A dynamic approach to operations management: An alternative to static optimization, 1992.

[8] R. Hayward, et al. Are Bad Outcomes from Questionable Clinical Decisions Preventable Medical Errors? A Case of Cascade Iatrogenesis, 2002, Annals of Internal Medicine.

[9] Kaveh G. Shojania, et al. Learning from Our Mistakes: Quality Grand Rounds, a New Case-Based Series on Medical Errors and Patient Safety, 2002, Annals of Internal Medicine.

[10] T. van der Schaaf. Medical applications of industrial safety science, 2002.

[11] M. Chassin, et al. The Wrong Patient, 2002, Annals of Internal Medicine.

[12] W. R. Jarvis, et al. Infection control and changing health-care delivery systems, 2001, Emerging Infectious Diseases.

[13] T. P. Clemmer, et al. Results of a collaborative quality improvement program on outcomes and costs in a tertiary critical care unit, 1999, Critical Care Medicine.

[14] Jody Hoffer Gittell. The Southwest Airlines Way: Using the Power of Relationships to Achieve High Performance, 2002.

[15] S. Spear. Learning to lead at Toyota, 2004, Harvard Business Review.

[16] R. Weinstein, et al. Nosocomial infection update, 1998, Emerging Infectious Diseases.

[17] Gail A. Wolf, et al. Driving Improvement in Patient Care: Lessons From Toyota, 2003, The Journal of Nursing Administration.

[18] Durward K. Sobek, et al. The Second Toyota Paradox: How Delaying Decisions Can Make Better Cars Faster, 1995.

[19] Paul S. Adler, et al. Designed for Learning: A Tale of Two Auto Plants, 2007.

[20] P. Adler. Time-and-motion regained, 1993.

[21] A. Markowitz. Unexpected Hypoglycemia in a Critically Ill Patient, 2002.

[22] L. Kohn, et al. To Err Is Human: Building a Safer Health System, 2007.

[23] Erik Jan Hultink, et al. Product development performance: strategy, organization and management in the world auto industry, 1994.

[24] John Paul MacDuffie, et al. The Road to Root Cause: Shop-Floor Problem-Solving at Three Auto Assembly Plants, 1997.

[25] D. Bates, et al. Relationship between medication errors and adverse drug events, 1995, Journal of General Internal Medicine.