Evaluating the teaching of evidence-based medicine.

An increasing number of medical schools and residency programs are instituting curricula for teaching the principles and practice of evidence-based medicine (EBM). For example, 95% of US internal medicine residency programs have journal clubs, and 37% of US and Canadian internal medicine residencies have time dedicated to EBM. Curricula based on EBM are increasingly popular in residency programs in other specialties, including family medicine, pediatrics, obstetrics/gynecology, and surgery.

Despite the widespread teaching of EBM, however, most of what is known about the outcomes of evidence-based curricula relies on observational data. Although evaluation of the quality of research evidence is a core competency of EBM, the quantity and quality of the evidence for effectively teaching EBM are poor. Ironically, if one were to develop guidelines for how to teach EBM based on these results, they would rest on the lowest level of evidence.

There are several reasons why the quality of the evidence for teaching EBM is so weak. Many of these problems reflect the limitations of educational research in general. First, quantitative research methods may be inadequate to capture the complexity of an educational system. Second, students and residents change frequently, making it difficult to retain a consistent sample. Third, the time allotted for a given intervention may be brief in the context of the overall medical curriculum. Fourth, educational institutions may be hesitant to pay students as research participants or to allocate them to unproved educational interventions. Fifth, because most educational interventions are unique to specific institutions, assessment of their effectiveness is usually limited by small sample sizes. Furthermore, even if such interventions could be instituted across multiple institutions, the problems of standardization and cointervention would be particularly challenging.
Sixth, perhaps because they are simplest to measure, the most frequently reported outcomes are subjective variables such as satisfaction or self-reported changes in attitudes or knowledge, rather than more important assessments of objectively measured clinical skills or improved patient outcomes. Finally, granting agencies do not give priority to educational investigations, making it difficult to undertake definitive multicenter studies. Educators who have struggled to evaluate educational interventions will find these issues all too familiar. With the increasing prevalence of EBM teaching, however, high-quality evidence is more important than ever.

Assessment of EBM teaching has also presented some unique problems. For instance, we originally defined evidence-based practice in terms of 4 basic competencies: (1) recognition of a patient problem and construction of a structured clinical question; (2) ability to efficiently and effectively search the medical literature to retrieve the best available evidence to answer the clinical question; (3) critical appraisal of the evidence; and (4) integration of the evidence with all aspects of individual patient decision making to determine the best clinical care for the patient. Although these 4 skills were the most commonly reported curricular objectives in 99 internal medicine residencies that teach EBM, almost all the research on EBM education has focused exclusively on the third item: teaching critical appraisal skills.

Examining this literature may yield useful insights into the difficulties of EBM educational research. Because critical appraisal skills involve the ability to differentiate strong from weak research methods, one might expect this research to be of relatively high quality. In fact, most of these studies are methodologically weak. Using broad criteria to identify any reports of graduate (residency) EBM curricula, Green identified 18 reports published between 1980 and 1997.
Of these, 72% used a traditional journal club format to teach critical appraisal skills. Only 7 of the 18 studies evaluated the effectiveness of their intervention. Five of these 7 compared the intervention with a control (only 1 with a randomized design), and only 2 of the 7 used any blinding. Of the 5 controlled studies, 2 used a validated outcome measure to evaluate critical appraisal skills. Measurement of behavioral change relied on self-report in all 5 studies, and none examined patient outcomes. Most reports did not evaluate their intervention

[1] P. Ellis, et al. Impact of an evidence-based medicine curriculum based on adult learning theory. Journal of General Internal Medicine. 1997.

[2] E. Murray. Challenges in educational research. Medical Education. 2002.

[3] R. Mears, et al. A systematic review of the effectiveness of critical appraisal skills training for clinicians. Medical Education. 2000.

[4] A. Detsky, et al. Evidence-based medicine: a new approach to teaching the practice of medicine. JAMA. 1992.

[5] D. Sackett, et al. A controlled trial of teaching critical appraisal of the clinical literature to medical students. JAMA. 1987.

[6] J. Pilcher, et al. Effects of sleep deprivation on performance: a meta-analysis. Sleep. 1996.

[7] D. Grimes. Introducing evidence-based medicine into a department of obstetrics and gynecology. Obstetrics and Gynecology. 1995.

[8] M. Koslowsky, et al. Meta-analysis of the relationship between total sleep deprivation and performance. Chronobiology International. 1992.

[9] M. Green, et al. Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Academic Medicine. 1999.

[10] L. Leung, et al. Sleep deprivation and house staff performance: update 1984-1991. Journal of Occupational Medicine. 1992.

[11] P. Ewings, et al. Development and validation of a questionnaire to evaluate the effectiveness of evidence-based practice teaching. Medical Education. 2001.

[12] E. DeLong, et al. Impact of a medical journal club on house-staff reading habits, knowledge, and critical appraisal skills: a randomized control trial. JAMA. 1988.

[13] P. Sandercock, et al. Framework for design and evaluation of complex interventions to improve health. BMJ. 2000.

[14] M. Green. Evidence-based medicine training in internal medicine residency programs. 2000.

[15] J. Sidorov, et al. How are internal medicine residency journal clubs organized, and what makes them successful? Archives of Internal Medicine. 1995.

[16] A. E. Dobbie, et al. What evidence supports teaching evidence-based medicine? Academic Medicine. 2000.

[17] P. O'Sullivan, et al. Evaluating medical residents' literature-appraisal skills. Academic Medicine. 1995.

[18] D. A. Asch, et al. The Libby Zion case: one step forward or two steps backward? The New England Journal of Medicine. 1988.

[19] S. Ancoli-Israel, et al. Sleep deprivation and clinical performance. JAMA. 2002.

[20] G. R. Norman, et al. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. CMAJ. 1998.

[21] R. C. Bogdan, S. K. Biklen. Qualitative research for education: an introduction to theory and methods. 1997.