Lessons from the War on Cancer: The Need for Basic Research on Safety

We have reached the fifth anniversary of the IOM's report To Err is Human. It was on the heels of that report that a previous Administration declared War on Error and announced its goal of a 50% reduction in medical error within 5 years. Those years have passed. With the clock running out, there is very little evidence that accidents in healthcare have decreased substantially, in either frequency or severity, and there are no technological, organizational, or political changes in the offing that promise to transform the landscape of patient safety in the next 100 days. Few observers now predict that the ambitious goal of a 50% reduction will be achieved.

There are parallels to be drawn between our experience thus far in the war on error and that other healthcare war, the War on Cancer that President Nixon began in 1971. In both cases the goal seemed to be well within reach of a nation confident in its technical and managerial capacity and proud of its ability to handle technical problems. It seemed likely, in 1971, that cancer would surely succumb to the forces that had recently landed humans on the moon, if only those forces could be brought to bear. There were reasons for optimism. Surgery and radiation therapy seemed promising. We were poised for a great leap forward. It was just a matter of working out the details. There was an expectation that progress would be swift, perhaps a decade or so, just like the space program.

There were some surprises along the way. The War on Cancer has lasted longer than the Thirty Years War. The initial $100 million budgeted by President Nixon has expanded to around $3 billion per year. More importantly, sustained progress on cancer has come through basic research on the mechanisms of cell metabolism, the transcription and processing of DNA, and the control of the cell cycle, not via improvements in the techniques and applications already established in 1971.
Indeed, arguably, the most remarkable success of the War on Cancer has been the willingness of the Congress and Administration to sustain basic research through 4 decades to bring President Nixon's original goals at least within our reach, if not within our grasp.

In the heady, early days of the patient safety movement, the possibility of rapid progress was an article of faith. Characterizations of the state of healthcare safety were, to put it mildly, hyperbolic. Vivid language describing death and mayhem was used in an unabashed campaign for the attention of the body politic. Along with capturing political attention, the language gave rise to the impression that quick progress was available. Much was made of healthcare's poor record in comparison with what were regarded as highly reliable endeavors, such as commercial aviation and space mission operations. The explicit message was that healthcare lagged far behind other industries and, this being so, that rapid progress on safety was readily available. We could skip the basic research and import lessons learned in the form of applications that could be exploited quickly to make patient safety much better. The "low hanging fruit" included, among others, new information technology to forestall human error, the development of standards, and the creation of a safety "culture" for healthcare.

Harvesting this fruit has proven considerably more difficult, time consuming, and expensive than expected. More importantly, the experiments (all the efforts to date are experiments) conducted with technology and organizations have not provided much insight into patient safety itself. These efforts have been applications rather than explorations. Applications do not provide much insight into basic mechanisms. When they work, we are not sure why; when they fail, we cannot tell how. If progress on patient safety is simply a matter of putting in place a few bits of technical and organizational machinery, this doesn't matter much.
If we find that the low hanging fruit is not really low or not really fruit, it will matter a great deal. We presently lack both the tools and the models needed to understand what it is that we are actually doing as we pursue safety, and this is true at both the sharp and the blunt ends of the system. For basic research we have tended to rely on other domains, hoping that the lessons learned there can be imported directly and painlessly into healthcare. But healthcare is quantitatively and qualitatively different from commercial aviation or manned spaceflight. It is not manufacturing, nor telecommunications, nor nuclear power. In scale it dwarfs every other organized endeavor in our society. In complexity it is orders of magnitude more heterogeneous than the other socio-technical activities with which it is compared. Many analogies have been made between healthcare and other industries. Although analogies may be a good place to start an inquiry into safety, they are a bad place to end it.

Very little time, money, or attention has been directed at systematically pursuing basic research on the subjects that bear on patient safety. Now would be a good time to change that. What is basic research on safety? I believe that it is research in 3 broad areas. First, basic research on safety is the study of the factors that influence each element in Rasmussen's masterful Figure 1, perhaps the single most important figure we have to guide our research on safety. To understand safety we must come to grips with the fact that failure-free performance is not a fixed

From Cognitive Technologies Laboratory, University of Chicago, Chicago, Illinois.
Correspondence: Richard I. Cook, MD, 5841 S. Maryland Ave., MC4028, Chicago, IL 60637 (e-mail: ri-cook@uchicago.edu).
Copyright 2005 by Lippincott Williams & Wilkins