“Safeware”: Safety-Critical Computing and Health Care Information Technology

Information technology (IT) is highly promoted as a mechanism for advancing safety in health care. Ironically, little attention has been paid to the issues of safety in health care IT itself. Computer scientists have extensively studied the problem of assured performance in safety-critical computing systems. They have developed a conceptual approach and a set of techniques for use in settings where incorrect or aberrant operation (or results from correct operation that are aberrant in context) might endanger users, the public, or the environment. However, these methods are not commonly used in health care IT, which generally has been developed without specific consideration of the special factors and unique requirements for safe operation. This article provides a brief introduction for health care professionals and informaticians to what has been called “safeware,” a comprehensive approach to hazard analysis, design, operation, and maintenance of both hardware and software systems. This approach considers the entire joint sociotechnical system (including its operators) over its entire lifecycle, from conception through operation and on to decommissioning. Adoption of safeware methods should enhance the trustworthiness of future health IT.

Introduction

Twenty-five years ago, Lisanne Bainbridge coined the phrase “ironies of automation” to refer to the observation that introducing automation into a complex sociotechnical system to improve safety and performance often simultaneously introduced new problems into the system that degraded safety and performance. 2 Despite this experience, the belief that advanced information technology (IT) is a critical mechanism by which to improve the safety of health care is strongly held by academics, public officials, and vendor, business, and civic groups. 4, 5, 6, 7, 8, 9 The anticipated benefits of health care IT are presented in these discussions as a sort of manifest destiny—difficult, to be sure, but ultimately inevitable.
While there have been many discussions about the challenges, costs, priorities, and other planning issues involved in implementing IT, there has been virtually no discussion about how to make health IT itself safe for patients, practitioners, and health care organizations. The irony of seeking safety through systems that may not be safe to begin with seems to have been lost in the enthusiasm for remaking health care via IT. Past experience with IT has not shown it to be an unequivocal success. 14 Hardware failures have propagated in unexpected ways to remote, ostensibly unrelated components on a common network; system upgrades have led to missing or false laboratory information; programming mistakes have similarly led to incorrect guidance in decision support; and

[1]  John A. McDermid,et al.  A systematic approach to safety case maintenance , 1999, Reliab. Eng. Syst. Saf..

[2]  Reed M. Gardner,et al.  Position Paper: Recommendations for Responsible Monitoring and Regulation of Clinical Software Systems , 1997, J. Am. Medical Informatics Assoc..

[3]  Gilad J. Kuperman,et al.  Case Report: Comprehensive Analysis of a Medication Dosing Error Related to CPOE , 2005, J. Am. Medical Informatics Assoc..

[4]  Rainu Kaushal,et al.  Defining the Priorities and Challenges for the Adoption of Information Technology in Health Care: Opinions from an Expert Panel , 2003, AMIA.

[5]  S. Rosenfeld,et al.  Medicare's next voyage: encouraging physicians to adopt health information technology. , 2005, Health affairs.

[6]  Neil R. Storey,et al.  Safety-critical computer systems , 1996 .

[7]  Nancy G. Leveson,et al.  Software safety: why, what, and how , 1986, CSUR.

[8]  Y. Han,et al.  Unexpected Increased Mortality After Implementation of a Commercially Sold Computerized Physician Order Entry System , 2005, Pediatrics.

[9]  Gilad J. Kuperman,et al.  Synthesis of Research Paper: A Consensus Statement on Considerations for a Successful CPOE Implementation , 2003, J. Am. Medical Informatics Assoc..

[10]  Robert L. Wears,et al.  Automation, interaction, complexity, and failure: A case study , 2006, Reliab. Eng. Syst. Saf..

[11]  Peter G. Neumann,et al.  Computer-related risks , 1994 .

[12]  Robert L. Wears,et al.  The Role of Automation in Complex System Failures , 2005 .

[13]  D. Norman,et al.  New technology and human error , 1989 .

[14]  Nancy G. Leveson,et al.  A systems-theoretic approach to safety in software-intensive systems , 2004, IEEE Transactions on Dependable and Secure Computing.

[15]  J. Feinglass,et al.  The epidemiology of prescribing errors: the potential impact of computerized prescriber order entry. , 2004, Archives of internal medicine.

[16]  D. Bates,et al.  The Costs of a National Health Information Network , 2005, Annals of Internal Medicine.

[17]  Arthur L. Norberg,et al.  Trapped in the net: The unanticipated consequences of computerization , 1998 .

[18]  C. W. Johnson,et al.  Why did that happen? Exploring the proliferation of barely usable software in healthcare systems , 2006, Quality and Safety in Health Care.

[19]  Ian Sommerville,et al.  Trust in Technology: A Socio-Technical Perspective , 2006, Computer Supported Cooperative Work.

[20]  L. Bainbridge Ironies of Automation , 1982 .

[21]  J. Westbrook,et al.  Should clinical software be regulated? , 2006, The Medical journal of Australia.

[22]  Nancy G. Leveson  Safeware: System Safety and Computers , 1995 .

[23]  James T. Reason,et al.  Managing the risks of organizational accidents , 1997 .

[24]  A. Localio,et al.  Role of computerized physician order entry systems in facilitating medication errors. , 2005, JAMA.

[25]  F. Girosi,et al.  Promoting health information technology: is there a case for more-aggressive government action? , 2005, Health affairs.