Introducing recent medical graduates as members of Script Concordance Test expert reference panels: what impact?

The Script Concordance Test (SCT) is increasingly used for professional development in clinical reasoning, with a linear progression in SCT performance observed with increasing clinical experience. One limiting factor for the SCT is potential burnout among expert reference panel (ERP) members, which we have attempted to address by introducing recent medical graduates as panel members. We sought to evaluate the effect of introducing recent medical graduates into our ERPs on pass/fail decisions in the final clinical reasoning examination of the 6-year undergraduate program of the University of Adelaide, Australia. We engaged an ERP comprising 50 faculty members from three collaborating universities and 13 recent medical graduates, who each answered an identical online multidisciplinary SCT (20 case scenarios, 50 questions) twice, 6 months apart. The questions were used in the high-stakes end-of-year assessment of 5th-year medical students (n=132). The pass mark set by the experienced, specialist members of the panel was 49.6%; adding recent medical graduates to the panel raised it to 50.4%. This difference would have had no effect on fail rates estimated from the data for the cohort of 132 medical student candidates. In the context of assessment of clinical reasoning in medical programs, recent medical graduates are suitable members of SCT ERPs; their contribution can enrich the panel and may help minimise the risk of burnout among more experienced faculty.
