Getting off the "gold standard": randomized controlled trials and education research.

While useful in some situations, randomization is not the “gold standard” for medical education research. More important is that methodological decisions precede the intervention, that adequate numbers of subjects and iterations are used, that a comparison group is included, and that limitations are addressed thoughtfully and thoroughly. In addition, the literature demonstrates quite definitively that medical learners will learn whatever we teach and may also compensate for teaching deficits in order to meet certification requirements.18 Thus, to advance our understanding of effective educational interventions, a new intervention should be compared with another effective intervention; unlike in clinical research, a placebo arm is rarely helpful. Comparing the new intervention to “usual” practice is productive as long as the usual practice is well described and students are not “cued” to the novelty of the research arm, which may enhance (or negatively bias) their learning. Whether randomized or nonrandomized, medical education studies must carefully analyze sources of bias, including unforeseen confounding variables, to explain the observed results and to ensure that subsequent researchers will find them reproducible. Rather than an obligatory listing of potential sources of error in the discussion, researchers will add to existing knowledge through a careful, detailed analysis of the biases that may have affected the results.3 Equally important, research designs must include a control group, concurrent if possible, to ensure equal likelihood of exposure to nonintervention events that could bias the results. If educators begin to work together, more collaborative, multi-institutional projects, perhaps akin to “pragmatic trials,”12 may emerge, adding substantially to our understanding of effective resident education.

[1] David A. Cook, et al. Reflections on experimental research in medical education, 2010, Advances in Health Sciences Education: Theory and Practice.

[2] David A. Cook, et al. Predictive Validity Evidence for Medical Education Research Study Quality Instrument Scores: Quality of Submissions to JGIM’s Medical Education Special Issue, 2008, Journal of General Internal Medicine.

[3] William C. McGaghie, et al. The Reputation of Medical Education Research: Quasi-Experimentation and Unresolved Threats to Validity, 2008, Teaching and Learning in Medicine.

[4] J. Ware, et al. Pragmatic trials--guides to better patient care?, 2011, The New England Journal of Medicine.

[5] David Prideaux, et al. Researching the outcomes of educational interventions: a matter of design, 2002, BMJ: British Medical Journal.

[6] K. Eva. Broadening the debate about quality in medical education research, 2009, Medical Education.

[7] L. Gruppen. Improving Medical Education Research, 2007, Teaching and Learning in Medicine.

[8] I. R. Hart, et al. BEME Guide No. 1: Best Evidence Medical Education, 1999, Medical Teacher.

[9] G. Norman, et al. Randomized controlled trials, 2004, AJR: American Journal of Roentgenology.

[10] David W. Nierenberg, et al. Educational epidemiology: applying population-based design and analytic approaches to study medical education, 2004, JAMA.

[11] K. Mann, et al. A RIME Perspective on the Quality and Relevance of Current and Future Medical Education Research, 2004, Academic Medicine: Journal of the Association of American Medical Colleges.

[12] G. Norman. Is experimental research passé?, 2010, Advances in Health Sciences Education: Theory and Practice.

[13] Glenn Regehr. It’s NOT rocket science: rethinking our metaphors for research in health professions education, 2010, Medical Education.

[14] Olle ten Cate. What happens to the student? The neglected variable in educational outcome research, 2001.

[15] D. Berwick, et al. The Science of Safety Improvement: Learning While Doing, 2011, Annals of Internal Medicine.

[16] G. Norman. RCT = results confounded and trivial: the perils of grand educational experiments, 2003, Medical Education.

[17] D. Berliner. Comment: Educational Research: The Hardest Science of All, 2002.