Factors that influence line managers' perceptions of hospital performance data.

OBJECTIVE To design and test a model of the factors that influence frontline and midlevel managers' perceptions of the usefulness of comparative reports of hospital performance. STUDY SETTING A total of 344 frontline and midlevel managers with responsibility for stroke and medical cardiac patients in 89 acute care hospitals in the Canadian province of Ontario. STUDY DESIGN Fifty-nine percent of managers responded to a mail survey covering their familiarity with a comparative report of hospital performance; their ratings of the report's data quality, relevance, and complexity; the improvement culture of their organization; and their perceptions of the report's usefulness. DATA COLLECTION/EXTRACTION METHODS Exploratory factor analysis was performed to assess the dimensionality of the performance data characteristics and improvement culture measures. Antecedents of perceived usefulness, and the role of improvement culture as a moderator, were tested using hierarchical regression analyses. PRINCIPAL FINDINGS Data characteristics (data quality, relevance, and report complexity) and organizational factors (dissemination intensity and improvement culture) both explain significant amounts of variance in perceptions of the usefulness of comparative reports of hospital performance; the full hierarchical regression model has a total R2 of .691. Improvement culture moderates the relationship between data relevance and perceived usefulness. CONCLUSIONS Organizations, and those who fund and design performance reports, need to recognize that both report characteristics and organizational context play an important role in determining line managers' response to, and ability to use, these types of data.
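The two analysis steps named in the abstract (exploratory factor analysis of the survey scales, followed by hierarchical regression with improvement culture tested as a moderator) can be sketched in code. The snippet below is a minimal illustration only, not the authors' analysis: the file name manager_survey.csv, the column names (item_ prefixes, data_quality, relevance, complexity, dissemination, culture, usefulness), and the choice of four factors with varimax rotation are assumptions made for the example, and it relies on the factor_analyzer and statsmodels packages.

```python
# Minimal sketch of the abstract's analysis steps; all file and column names
# below are hypothetical placeholders, not the study's actual measures.
import pandas as pd
import statsmodels.formula.api as smf
from factor_analyzer import FactorAnalyzer

# Survey responses, one row per manager (hypothetical file).
df = pd.read_csv("manager_survey.csv")

# Step 1: exploratory factor analysis to assess the dimensionality of the
# data-characteristics and improvement-culture survey items.
items = df.filter(like="item_")            # columns holding the Likert items
efa = FactorAnalyzer(n_factors=4, rotation="varimax")  # factor count assumed
efa.fit(items)
print(efa.loadings_)

# Step 2: hierarchical regression. Block 1 enters data characteristics and
# organizational factors; Block 2 adds the relevance x culture product term
# to test moderation. (Predictors would typically be mean-centered before
# forming the interaction term.) The R-squared increment gauges moderation.
block1 = smf.ols(
    "usefulness ~ data_quality + relevance + complexity + dissemination + culture",
    data=df,
).fit()
block2 = smf.ols(
    "usefulness ~ data_quality + relevance + complexity + dissemination + culture"
    " + relevance:culture",
    data=df,
).fit()
print(f"R2 block 1 = {block1.rsquared:.3f}, R2 block 2 = {block2.rsquared:.3f}")
print(f"Delta R2 for moderation = {block2.rsquared - block1.rsquared:.3f}")
```

In this layout, a significant coefficient on the relevance:culture term (and a meaningful R2 increment from Block 1 to Block 2) would correspond to the moderation effect reported in the abstract.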
