Interactive learning with voting technology
…ation of common neurological and cardiac findings, and their correlation with diagnosis. For each clinical finding, we created multimedia stems that included photographs, graphics, and audio-video clips of simulators and SPs. We then developed corresponding questions, which were reviewed by course directors (who evaluated content) and by two educational psychologists experienced in test development (who evaluated items for structure, consistency and validity). All test items were matched to the outcomes/competencies defined for medical students at the end of their first year.

Evaluation of results and impact

To date, 291 first-year medical students have completed this examination, and most have also answered a questionnaire designed to gauge their attitudes toward the computer exercise. Across the 18 items, the mean score was 14.52 (80.7%), with a standard deviation of 1.83 and a range of 9–18 (a brief check of this arithmetic appears below). We provided feedback to all students, and those who scored below the 75% competency mark were offered remediation before retaking the examination. Of the questionnaire respondents, 87% felt the computer exercise covered important information learned during their first year, 96.5% agreed they were given sufficient time to complete it, and 93.5% found the format user-friendly.

Thus, we developed computer-based outcome measures and successfully integrated them into our school's competency assessment. Future work will assess skills retention and the correlation between student performance on this computer exercise and other outcome measures of clinical skills.
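As a quick check of the reported figures (assuming the percentage is simply the mean raw score divided by the 18-item maximum, and that the 75% cut-off applies to the same 18-item total):

\[
\frac{14.52}{18} \times 100\% \approx 80.7\%
\qquad\text{and}\qquad
0.75 \times 18 = 13.5,
\]

so under these assumptions a student needed at least 14 of the 18 items correct to meet the 75% competency mark.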