The Correlation of Workplace Simulation-Based Assessments With Interns’ Infant Lumbar Puncture Success: A Prospective, Multicenter, Observational Study

Introduction: Few data are available to guide supervisors' decisions about when trainees are ready to safely perform their first procedure on a patient. We aimed to describe the correlation of simulation-based assessments, conducted in the workplace, with interns' success on their first clinical infant lumbar puncture (ILP).

Methods: This is a prospective, observational subcomponent of a larger study of incoming interns at 33 academic medical centers (July 2010 to June 2012) assessing the impact of just-in-time training. When an intern's patient required an ILP, the intern completed a just-in-time, simulation-based skills refresher with his or her supervisor. At the end of the refresher, and before the clinical procedure, the supervisor assessed the intern's ILP skills on a simulator in the workplace using a four-point anchored scale. The primary outcome was the correlation between the supervisor's assessment and the intern's clinical procedural success. The number needed to assess for this instrument (1/absolute risk reduction) was also calculated.

Results: A total of 1600 interns were eligible to participate, and 1215 were enrolled; 297 completed both an assessment and a subsequent clinical ILP. Success rates by scale rating were 29% (18/63) for novice, 39% (51/130) for beginner, 55% (46/83) for competent, and 43% (9/21) for proficient. The correlation coefficient was 0.161 (95% confidence interval, 0.057–0.265), indicating a weak correlation between supervisor rating and success. The success rate was 53% for interns rated competent or proficient compared with 35% for those rated novice or beginner. Using the global rating scale as a summative assessment of procedural readiness could therefore result in one fewer patient experiencing a failed ILP for every 6 interns assessed (number needed to assess, 6.2; 95% confidence interval, 4.0–8.5).

Conclusions: A simulation-based assessment of interns conducted in the workplace before their first ILP has some value in predicting clinical ILP success.
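The summary statistics above can be largely reconstructed from the per-rating counts reported in the Results. The sketch below is not the authors' analysis code: it assumes the reported coefficient is Kendall's tau-b (the abstract does not name the coefficient, but tau-b computed on these counts reproduces the reported 0.161), and the crude dichotomized success rates (about 53% vs. 36%) and number needed to assess (about 5.8) it produces are close to, but not identical to, the published 53%, 35%, and 6.2, which presumably reflect the authors' exact methods and interval estimation.

```python
# Minimal sketch reproducing the abstract's summary statistics from the
# reported per-rating counts. Assumption: the correlation is Kendall's tau-b.
import numpy as np
from scipy.stats import kendalltau

# Reported counts per supervisor rating: (successes, attempts)
counts = {
    1: (18, 63),    # novice
    2: (51, 130),   # beginner
    3: (46, 83),    # competent
    4: (9, 21),     # proficient
}

# Expand to one row per intern: supervisor rating vs. clinical ILP success (0/1)
ratings, successes = [], []
for rating, (n_success, n_total) in counts.items():
    ratings += [rating] * n_total
    successes += [1] * n_success + [0] * (n_total - n_success)

tau, _ = kendalltau(ratings, successes)
print(f"Kendall tau-b = {tau:.3f}")   # ~0.161, matching the reported coefficient

# Dichotomize ratings (competent/proficient vs. novice/beginner) and compute the
# number needed to assess, NNA = 1 / absolute risk reduction. This crude value
# (~5.8) approximates, but does not exactly match, the published 6.2.
high = np.array([r >= 3 for r in ratings])
succ = np.array(successes, dtype=bool)
p_high = succ[high].mean()   # ~0.53
p_low = succ[~high].mean()   # ~0.36
nna = 1.0 / (p_high - p_low)
print(f"success {p_high:.0%} vs {p_low:.0%}; NNA ~= {nna:.1f}")
```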
