Mobile Technology for the Facilitation of Direct Observation and Assessment of Student Performance

Background: We developed, implemented, and assessed a web-based clinical evaluation application (the CEX app) for Internet-enabled mobile devices, including mobile phones. The app displays problem-specific checklists that correspond to training problems created by the Clerkship Directors in Internal Medicine.

Purpose: We hypothesized that using the CEX app to directly observe students' clinical skills would be feasible and acceptable and would demonstrate adequate reliability and validity.

Methods: Between July 2010 and February 2012, 266 third-year medical students completed 5 to 10 formative CEXs during their internal medicine clerkship. The observers (attendings and residents) who performed the CEXs used the app to guide and document their observations, to record the time they spent observing and giving feedback to the students, and to report their overall satisfaction with the CEX app. Interrater reliability and validity were assessed by having 17 observers view 6 videotaped student–patient encounters, and by measuring the correlation between students' CEX scores and their scores on a subsequent standardized-patient Objective Structured Clinical Examination (OSCE).

Results: A total of 2,523 CEXs were completed by 411 observers. The average number of evaluations per student was 9.8 (SD 1.8), and the average number of CEXs completed per observer was 6 (SD 11.8). Observers spent less than 10 min on 45.3% of the CEXs and 68.6% of the feedback sessions. An overwhelming majority of observers (90.6%) reported satisfaction with the CEX. Interrater reliability among the observers viewing the videotapes was 0.69, and their ratings discriminated between competent and noncompetent performances. Students' CEX grades, however, did not correlate with their end-of-third-year OSCE scores.

Conclusions: Use of the CEX app is feasible, and it captures students' clinical performance data with a high rate of user satisfaction. Our embedded checklists had adequate interrater reliability and concurrent validity. The grades recorded on the app, however, were not predictive of subsequent student performance.
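The abstract reports an interrater reliability of 0.69 and a lack of correlation between CEX grades and OSCE scores, but does not name the statistics used. As an illustration only, the concurrent-validity check can be sketched as a Pearson correlation between per-student score lists; the function and the sample scores below are hypothetical and not taken from the study's data.

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-student averages: CEX grades vs. end-of-year OSCE scores.
cex_scores = [7.2, 8.1, 6.5, 9.0, 7.8, 8.4]
osce_scores = [70, 75, 68, 72, 71, 74]
print(round(pearson_r(cex_scores, osce_scores), 2))
```

A value near zero, as the study found, would indicate that formative CEX grades do not predict summative OSCE performance; the study itself may have used a different coefficient (e.g., an intraclass correlation for the interrater analysis).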
