Comparison of Quality of Care for Patients in the Veterans Health Administration and Patients in a National Sample

As methods for measuring the quality of medical care have matured, widespread quality problems have become increasingly evident (1, 2). The solution to these problems is much less obvious, however, particularly with regard to large delivery systems. Many observers have suggested that improved information systems, systematic performance monitoring, and coordination of care are necessary to enhance the quality of medical care (3). Although the use of integrated information systems (including electronic medical records) and performance indicators has become more common throughout the U.S. health care system, most providers are not part of a larger integrated delivery system and continue to rely on traditional information systems (4).

An exception is the Veterans Health Administration (VHA). As the largest delivery system in the United States, the VHA has been recognized as a leader in developing a more coordinated system of care. Beginning in the early 1990s, VHA leadership instituted both a sophisticated electronic medical record system and a quality measurement approach that holds regional managers accountable for several processes in preventive care and in the management of common chronic conditions (5, 6). Other changes include a system-wide commitment to quality improvement principles and a partnership between researchers and managers for quality improvement (7).

As Jha and colleagues (8) have shown, since these changes have been implemented, VHA performance has outpaced that of Medicare in the specific areas targeted. Nevertheless, whether this improvement has extended beyond the relatively narrow scope of the performance measures is unknown. Beyond that study, the data comparing VHA care with other systems of care are sparse and mixed. For example, patients hospitalized at VHA hospitals were more likely than Medicare patients to receive angiotensin-converting enzyme inhibitors and thrombolysis after myocardial infarction (9).
On the other hand, VHA patients were less likely to receive angiography when indicated and had higher mortality rates after coronary artery bypass grafting than patients in community hospitals (10, 11). Kerr and colleagues found that care for diabetes was better in almost every dimension in the VHA system than in commercial managed care (12). More extensive comparisons, especially of outpatient care, are lacking.

To address these issues, a more comprehensive assessment of quality is needed. Using a broad measure of quality of care that is based on medical record review and was developed outside the VHA, we compared the quality of outpatient and inpatient care in 2 samples: 1) a national sample of patients drawn from 12 communities and 2) VHA patients from 26 facilities in 12 health care systems located in the southwestern and midwestern United States (13). We analyzed performance in the years after the institution of routine performance measurement and the electronic medical record. Using the extensive set of quality indicators included in the measurement system, we compared the overall quality of care delivered in the VHA system and in the United States, as well as the quality of acute, chronic, and preventive care across 26 conditions. In addition, we evaluated whether VHA performance was better in the specific areas targeted by the VHA quality management system.

Methods

Development of Quality Indicators

For this study, we used quality indicators from RAND's Quality Assessment Tools system, which is described in more detail elsewhere (14-17). The indicators included in the Quality Assessment Tools system are process quality measures; compared with outcome measures, they are more readily actionable, require less risk adjustment, and follow the structure of national guidelines (18, 19).
After reviewing established national guidelines and the medical literature, we chose a subset of quality indicators from the Quality Assessment Tools system that represented the spectrum of outpatient and inpatient care (that is, screening, diagnosis, treatment, and follow-up) for acute and chronic conditions and preventive care processes representing the leading causes of morbidity, death, and health care use among older male patients. The Appendix Table lists the full indicator set, which was determined by four 9-member, multispecialty expert panels. These panels assessed the validity of the proposed indicators using the RAND/University of California, Los Angeles, modified Delphi method. The experts rated the indicators on a 9-point scale (1 = not valid; 9 = very valid), and we accepted indicators that had a median validity score of 7 or higher. This method of selecting indicators is reliable and has been shown to have content, construct, and predictive validity (20-23). Of the 439 indicators in the Quality Assessment Tools system, we included 348 indicators across 26 conditions in our study and excluded 91 indicators that were unrelated to the target population (for example, those related to prenatal care and cesarean sections). Of the 348 indicators, 21 were indicators of overuse (for example, patients with moderate to severe asthma should not receive β-blocker medications) and 327 were indicators of underuse (for example, patients who have been hospitalized for heart failure should have follow-up contact within 4 weeks of discharge).

Appendix Table. Comparison of Performance of the Veterans Health Administration Sample and the National Sample by Indicator

Two physicians independently classified each indicator according to the type of care delivered; the function of the indicated care (screening, diagnosis, treatment, and follow-up); and whether the indicator was supported by a randomized, controlled trial, another type of controlled trial, or other evidence.
Type of care was classified as acute (for example, in patients presenting with dysuria, presence or absence of fever and flank pain should be elicited), chronic (for example, patients with type 2 diabetes mellitus in whom dietary therapy has failed should receive oral hypoglycemic therapy), or preventive (for example, all patients should be screened for problem drinking). In addition, we further classified the indicators into 3 mutually exclusive categories according to whether they corresponded to the VHA performance indicators that were in use in fiscal year 1999. Twenty-six indicators closely matched the VHA indicators, 152 involved conditions that were targeted by the VHA indicators but were not among the 26 matches, and 170 did not match the VHA measures or conditions. We performed a similar process to produce a list of 15 indicators that matched contemporaneous Health Plan Employer Data and Information Set (HEDIS) performance measures (24). Table 1 shows the conditions targeted by the indicators, and Table 2 gives an example indicator for each of the conditions or types of care for which condition- or type-specific comparisons were possible.

Table 1. Conditions and Number of Indicators Used in Comparisons

Table 2. Example Indicators of Quality of Care

Identifying Participants

Patients were drawn from 2 ongoing quality-of-care studies: a study of VHA patients and a random sample of adults from 12 communities (13). The VHA patients were drawn from 26 clinical sites in 12 health care systems located in 2 Veterans Integrated Service Networks in the midwestern and southwestern United States. These networks closely match the overall Veterans Affairs system with regard to medical record review and survey-based quality measures (25, 26). We selected patients who had had at least 2 outpatient visits in each of the 2 years between 1 October 1997 and 30 September 1999. A total of 106,576 patients met these criteria.
We randomly sampled 689 patients, oversampling for chronic obstructive pulmonary disease (COPD), hypertension, and diabetes, and were able to locate records for 664 patients (a record location rate of 96%). Because of resource constraints, we reviewed a random subset of 621 of these records. Because this sample contained only 20 women and 4 patients younger than 35 years of age, we further restricted the sample to men older than 35 years of age. Thus, we included 596 VHA patients in the analysis. All of these patients had complete medical records.

The methods we used to obtain the national sample have been described elsewhere (13) and are summarized here. As part of a nationwide study, residents of 12 large metropolitan areas (Boston, Massachusetts; Cleveland, Ohio; Greenville, South Carolina; Indianapolis, Indiana; Lansing, Michigan; Little Rock, Arkansas; Miami, Florida; Newark, New Jersey; Orange County, California; Phoenix, Arizona; Seattle, Washington; and Syracuse, New York) were contacted by using random-digit dialing and were asked to complete a telephone survey (27). To ensure comparability with the VHA sample, we included only men older than 35 years of age. Between October 1998 and August 2000, we telephoned 4086 of these participants and asked for permission to obtain copies of their medical records from all providers (both individual and institutional) that they had visited within the past 2 years. We received verbal consent from 3138 participants (77% of those contacted by telephone). We mailed consent forms and received written permission from 2351 participants (75% of those who had given verbal permission). We received at least 1 medical record for 2075 participants (88% of those who had returned consent forms). We excluded participants who had not had at least 2 medical visits in the past 2 years to further ensure comparability with the VHA sample. Thus, our final national sample included 992 persons.
The rolling abstraction period (October 1996 to August 2000) substantially overlapped the VHA sampling period. The average overlap was 70%, and all records had at least 1 year of overlap. Seven hundred eight (71%) of the 992 persons in the national sample had complete medical records. On the basis of data from the original telephone survey, we det

References

[1] E. McGlynn, et al. Quality of Care for Cardiopulmonary Conditions, 2000.
[2] A. Rosen, et al. Eye examinations for VA patients with diabetes: standardizing performance measures. International Journal for Quality in Health Care, 2000.
[3] J. Feussner, et al. Reinventing VA health care: systematizing quality improvement and quality innovation. Medical Care, 2000.
[4] R. Tibshirani, et al. An Introduction to the Bootstrap, 1994.
[5] P. Shekelle, et al. Assessing the predictive validity of the RAND/UCLA appropriateness method criteria for performing carotid endarterectomy. International Journal of Technology Assessment in Health Care, 1998.
[6] J. Halpern. The measurement of quality of care in the Veterans Health Administration. Medical Care, 1996.
[7] L. Leape, et al. Comparison of use of medications after acute myocardial infarction in the Veterans Health Administration and Medicare. Circulation, 2001.
[8] P. Kemper, et al. The design of the community tracking study: a longitudinal study of health system change and its effects on people. Inquiry, 1996.
[9] E. McGlynn, et al. How good is the quality of health care in the United States? The Milbank Quarterly, 1998.
[10] K. W. Kizer, et al. Quality Enhancement Research Initiative (QUERI): a collaboration between research and clinical practice. Medical Care, 2000.
[11] E. McGlynn, et al. New approach to assessing clinical quality of care for women: the QA Tool system. Women's Health Issues, 1999.
[12] E. Peterson, et al. Racial variation in cardiac procedure use and survival following acute myocardial infarction in the Department of Veterans Affairs. JAMA, 1994.
[13] R. Kravitz, et al. Measuring the clinical consistency of panelists' appropriateness ratings: the case of coronary artery bypass surgery. Health Policy, 1997.
[14] N. Every, et al. Quality enhancement research initiative in ischemic heart disease: a quality initiative from the Department of Veterans Affairs, 2000.
[15] J. Luck, et al. Using standardised patients to measure physicians' practice: validation study using audio recordings. BMJ, 2002.
[16] E. McGlynn, et al. Quality of Care for General Medical Conditions: A Review of the Literature and Quality Indicators, 2000.
[17] K. Kizer. The "New VA": a national laboratory for health care quality management. American Journal of Medical Quality, 1999.
[18] E. McGlynn, et al. The quality of health care delivered to adults in the United States. The New England Journal of Medicine, 2003.
[19] J. M. Corrigan, et al. Priority Areas for National Action: Transforming Health Care Quality, 2003.
[20] J. R. Landis, et al. The measurement of observer agreement for categorical data. Biometrics, 1977.
[21] D. Singer, et al. Processes of care, illness severity, and outcomes in the management of community-acquired pneumonia at academic hospitals. Archives of Internal Medicine, 2001.
[22] E. McGlynn, et al. Evaluating the quality of cancer care. Cancer, 2000.
[23] L. Kohn, et al. Committee on Quality of Health Care in America, 2000.
[24] A. Jha, et al. Effect of the transformation of the Veterans Affairs Health Care System on the quality of care. The New England Journal of Medicine, 2003.
[25] S. Fihn. Does VA health care measure up? The New England Journal of Medicine, 2000.
[26] R. Mayberry, et al. Racial and ethnic differences in access to medical care. Medical Care Research and Review, 2000.
[27] C. M. Mangione, et al. Diabetes care quality in the Veterans Affairs Health Care System and commercial managed care: the TRIAD study. Annals of Internal Medicine, 2004.
[28] W. Rogers. Regression standard errors in clustered samples, 1994.
[29] E. McGlynn, et al. Quality of Care for Oncologic Conditions and HIV, 2000.
[30] J. W. Peabody, et al. Comparison of vignettes, standardized patients, and chart abstraction: a prospective validation study of 3 methods for measuring quality. JAMA, 2000.
[31] E. McGlynn, et al. Profiling the quality of care in twelve communities: results from the CQI study. Health Affairs, 2004.
[32] E. Hannan, et al. In-hospital mortality following coronary artery bypass graft surgery in Veterans Health Administration and private sector hospitals. Medical Care, 2003.
[33] L. Leape, et al. Regionalization and the underuse of angiography in the Veterans Affairs Health Care System as compared with a fee-for-service system. The New England Journal of Medicine, 2003.
[34] R. Gliklich, et al. Using "Get With The Guidelines" to improve cardiovascular secondary prevention. Joint Commission Journal on Quality and Safety, 2003.
[35] J. P. Kahan, et al. The reproducibility of a method to identify the overuse and underuse of medical procedures. The New England Journal of Medicine, 1998.
[36] E. McGlynn, et al. Developing a clinical performance measure. American Journal of Preventive Medicine, 1998.