Comparative Effectiveness of Technology-Enhanced Simulation Versus Other Instructional Methods: A Systematic Review and Meta-Analysis

Abstract

To determine the comparative effectiveness of technology-enhanced simulation, we summarized the results of studies comparing technology-enhanced simulation training with nonsimulation instruction for health professions learners. We systematically searched databases including MEDLINE, Embase, and Scopus through May 2011 for relevant articles. Working in duplicate, we abstracted information on instructional design, outcomes, and study quality. From 10,903 candidate articles, we identified 92 eligible studies. In random-effects meta-analysis, pooled effect sizes (positive numbers favoring simulation) were as follows: satisfaction outcomes, 0.59 (95% confidence interval, 0.36–0.81; n = 20 studies); knowledge, 0.30 (0.16–0.43; n = 42); time measure of skills, 0.33 (0.00–0.66; n = 14); process measure of skills, 0.38 (0.24–0.52; n = 51); product measure of skills, 0.66 (0.30–1.02; n = 11); time measure of behavior, 0.56 (−0.07 to 1.18; n = 7); process measure of behavior, 0.77 (−0.13 to 1.66; n = 11); and patient effects, 0.36 (−0.06 to 0.78; n = 9). For 5 studies reporting comparative costs, simulation was more expensive and more effective. In summary, in comparison with other instruction, technology-enhanced simulation is associated with small to moderate positive effects.
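The abstract reports pooled effect sizes with 95% confidence intervals from a random-effects meta-analysis. As an illustration of how such pooling works, the sketch below implements the standard DerSimonian–Laird random-effects estimator in plain Python. The function name and the example study data are hypothetical, not taken from the review; the review itself used its own extracted effect sizes and software.

```python
import math

def random_effects_pool(effects, variances):
    """Pool per-study effect sizes with a DerSimonian-Laird random-effects model.

    effects   : list of standardized mean differences (e.g., Hedges g), one per study
    variances : list of the corresponding sampling variances
    Returns (pooled_effect, ci_low, ci_high, tau2).
    """
    k = len(effects)
    # Fixed-effect (inverse-variance) weights and mean
    w = [1.0 / v for v in variances]
    sum_w = sum(w)
    fe_mean = sum(wi * yi for wi, yi in zip(w, effects)) / sum_w
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (yi - fe_mean) ** 2 for wi, yi in zip(w, effects))
    c = sum_w - sum(wi ** 2 for wi in w) / sum_w
    # Between-study variance estimate (truncated at zero)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Random-effects weights add tau2 to each study's sampling variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2

# Hypothetical example: three studies with heterogeneous effects
pooled, lo, hi, tau2 = random_effects_pool([0.1, 0.5, 0.9], [0.04, 0.04, 0.04])
print(f"pooled = {pooled:.2f} (95% CI, {lo:.2f} to {hi:.2f}); tau^2 = {tau2:.2f}")
```

When the studies disagree more than chance predicts (large Q), tau-squared grows, widening the confidence interval; this is why heterogeneous outcome categories in the abstract (e.g., process measure of behavior) have CIs that cross zero despite sizable point estimates.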
