Validating a global rating scale to monitor individual resident learning curves during arthroscopic knee meniscal repair.

PURPOSE To determine whether a global rating scale (GRS) with construct validity can also be used to assess the learning curve of individual orthopaedic trainees during simulated arthroscopic knee meniscal repair.

METHODS An established arthroscopic GRS was used to evaluate the technical skill of 19 orthopaedic residents performing a standardized arthroscopic meniscal repair in a bioskills laboratory. The residents had experience with diagnostic knee arthroscopy but no experience with arthroscopic meniscal repair. Each resident was videotaped performing an arthroscopic meniscal repair on 12 separate occasions. Performance was assessed with the GRS and with motion analysis, which objectively measured the time taken to complete the task, the path length of the subject's hands, and the number of hand movements. One author assessed all 228 videos, and 2 other authors rated 34 randomly selected videos to test the interobserver reliability of the GRS. The validity of the GRS was tested against the motion analysis.

RESULTS Objective assessment with motion analysis defined the surgeons' learning curve, showing significant improvement by each subject over the 12 episodes (P < .0001). The GRS showed a similar learning curve, with significant improvements in performance (P < .0001). The median GRS score improved from 15 of 34 (interquartile range, 14 to 17) at baseline to 22 of 34 (interquartile range, 19 to 23) in the final period. There was a moderate correlation (P < .0001, Spearman test) between the GRS and the motion analysis parameters (r = -0.58 for time, r = -0.58 for path length, and r = -0.51 for hand movements). The inter-rater reliability among the 3 trained assessors using the GRS was excellent (Cronbach α = 0.88).

CONCLUSIONS When compared with motion analysis, an established arthroscopic GRS with construct validity also offers a moderately feasible method to monitor the learning curve of individual residents during simulated knee meniscal repair.

CLINICAL RELEVANCE An arthroscopic GRS can be used to monitor skill improvement during knee meniscal repair and has the potential for use as a training and assessment tool in the real operating room.
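The abstract reports two statistics that link the GRS to the objective measures: Spearman correlations between GRS scores and the motion-analysis parameters, and Cronbach's α across the three raters. The sketch below is purely illustrative and is not the study's analysis code; it uses synthetic data, and all variable names and array shapes are assumptions, but it shows how these two quantities would typically be computed.

```python
# Illustrative sketch only (synthetic data, assumed shapes), showing how the
# Spearman correlation and Cronbach's alpha reported in the abstract would be
# computed. It is not the authors' analysis code.

import numpy as np
from scipy.stats import spearmanr


def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (videos x raters) matrix of GRS scores."""
    k = scores.shape[1]                          # number of raters
    item_vars = scores.var(axis=0, ddof=1)       # variance of each rater's scores
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


rng = np.random.default_rng(0)

# Hypothetical per-video data: GRS score and motion-analysis time (seconds),
# constructed so that higher GRS scores go with shorter task times.
grs = rng.integers(10, 30, size=50).astype(float)
time_taken = 600 - 10 * grs + rng.normal(0, 40, size=50)

rho, p = spearmanr(grs, time_taken)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")  # expect a negative correlation

# Hypothetical (videos x 3 raters) matrix for inter-rater reliability.
ratings = np.column_stack([grs + rng.normal(0, 1.5, size=50) for _ in range(3)])
print(f"Cronbach alpha = {cronbach_alpha(ratings):.2f}")
```

In the study itself these calculations would be applied to the observed GRS scores and motion-analysis outputs (time, path length, and number of hand movements) rather than to simulated values; the negative correlation coefficients reflect that better GRS scores accompany less time, shorter path length, and fewer movements.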
