A Longitudinal Evaluation of an Educational Software Program: A Case Study of Urinalysis‐Tutor™

Purpose. To examine students' learning before and after revision of an educational software program and to explore students' patterns of use of an interactive image-comparison feature.

Method. Study participants were 466 University of Washington School of Medicine students. Two cohorts of students (one in 1996 and one in 1997) used the original version of the software. Following analysis of the students' learning, the program was revised based on instructional design principles pertaining to visual learning and concept acquisition. A 1998 cohort of students used the revised program, and their performance was compared with that of the 1996 cohort. Analyses were based on pre- and post-test scores, data collected from observation of students, and navigational pathways tracked by the program.

Results. There was very little difference in the overall performance of the students who used the original program and those who used the revised program. An error analysis of 11 conceptual areas showed reductions in errors for six of the 11 concepts, with statistically significant reductions for two concepts. Additional navigational data collected in 1998 showed that students used the interactive image-comparison feature in distinct patterns. The data showed a positive association between performance and the anchored viewing mode of image display.

Conclusions. Although this study cannot point to specific design components that facilitated or hindered learning, it demonstrated a potential benefit of linking usage-pattern data with performance. Future studies should evaluate design factors that affect usage patterns and performance, based on navigational data collected while students interact with software programs.
