The usability of personal digital assistants (PDAs) for assessment of practical performance

Context  The administration of an objective structured clinical examination (OSCE) using paper checklists presents problems such as illegible handwriting, missing student names or numbers, and lost checklists. Calculating and entering results is not only time-consuming but also subject to human error, and feedback to students is rarely available. To rectify these problems, personal digital assistants (PDAs) and HaPerT software were acquired to replace paper checklists and to provide automated results and feedback. This study sought to determine the usability of the PDA assessment system.