Using cloud-based mobile technology for assessment of competencies among medical students

Valid, direct observation of medical student competency in clinical settings remains challenging and limits the opportunity to promote performance-based student advancement. The rationale for direct observation is to ascertain that students have acquired the core clinical competencies needed to care for patients. Too often, student observation results in highly variable evaluations that are skewed by factors other than the student’s actual performance. Barriers to effective direct observation and assessment include the lack of effective tools and strategies for ensuring that transparent standards are used to judge clinical competency in authentic clinical settings. We developed a web-based content management system, Just in Time Medicine (JIT), to address many of these issues. The goals of JIT were fourfold: first, to create a self-service interface allowing faculty with average computing skills to author customizable content and criterion-based assessment tools displayable on internet-enabled devices, including mobile devices; second, to create an assessment and feedback tool capable of capturing learner progress on hundreds of clinical skills; third, to enable faculty to easily access and use these tools for learner assessment in authentic clinical settings, as a means of just-in-time faculty development; and fourth, to create a permanent record of trainees’ observed skills useful for both learner and program evaluation. From July 2010 through October 2012, we implemented a JIT-enabled clinical evaluation exercise (CEX) among 367 third-year internal medicine students. Observers (attending physicians and residents) performed CEX assessments using JIT to guide and document their observations, to record the time they spent observing and giving feedback to students, and to record their overall satisfaction.
Inter-rater reliability was assessed with 17 observers who viewed six videotaped student–patient encounters; validity was assessed by measuring the correlation between students’ CEX scores and their scores on subsequent standardized-patient OSCE examinations. A total of 3567 CEXs were completed by 516 observers. Students received an average of 9.7 evaluations (±1.8 SD), and observers completed an average of 6.9 CEXs (±15.8 SD). Observers spent less than 10 minutes on 43–50% of the CEX observations and on 68.6% of the feedback sessions. A majority of observers (92%) reported satisfaction with the CEX. Inter-rater reliability was 0.69 among all observers viewing the videotapes, and these ratings adequately discriminated competent from non-competent performance. CEX grades correlated with subsequent student performance on an end-of-year OSCE. We conclude that JIT is feasible for capturing discrete clinical performance data with a high degree of user satisfaction, and that our embedded checklists had adequate inter-rater reliability as well as concurrent and predictive validity.
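The abstract does not state which reliability statistic was used. As an illustrative sketch only, assuming each of the six videotaped encounters was scored by every observer and the scores are arranged as an encounters × observers matrix, a single-rater consistency intraclass correlation of the ICC(3,1) form could be computed as follows (the function name and data layout are assumptions for illustration, not part of the study):

```python
import numpy as np

def icc_consistency(ratings: np.ndarray) -> float:
    """Single-rater consistency ICC(3,1) from a two-way ANOVA decomposition.

    ratings: (n_subjects, n_raters) matrix where every rater scores
    every subject (here: videotaped encounters x observers).
    """
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between-subject variance
    ss_cols = n * ((col_means - grand) ** 2).sum()   # systematic rater effects
    ss_err = ss_total - ss_rows - ss_cols            # residual disagreement

    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Example: two raters who agree perfectly up to a constant offset
# yield ICC(3,1) = 1.0, since consistency ignores rater mean shifts.
ratings = np.array([[1, 2], [2, 3], [3, 4]], dtype=float)
print(icc_consistency(ratings))  # 1.0
```

Because this consistency form removes systematic rater leniency from the error term, it reflects whether observers rank-order performances the same way, which is the property relevant to discriminating competent from non-competent performance.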
