Exploring the Value of Usability Feedback Formats

The format used to present feedback from usability evaluations to developers affects whether problems are understood, accepted, and fixed. Yet little research has investigated which formats are most effective. We describe an explorative study in which three developers assessed 40 usability findings presented in five feedback formats. The findings comprise 35 problems and 5 positive comments. Our data suggest that feedback serves multiple purposes. Initially, feedback must convince developers of a problem's relevance and convey an understanding of it; next, it must be easy to use; and finally, it must serve as a reminder of the problem. Before working with the feedback, developers rated redesign proposals, multimedia reports, and annotated screen dumps as more valuable than lists of problems, all of which were rated as more valuable than scenarios. After the developers had spent time working with the feedback to address the usability problems, there were no significant differences among their ratings of the formats' value. This suggests that all of the formats may serve equally well as reminders in later stages of working with usability problems, but that redesign proposals, multimedia reports, and annotated screen dumps best address the initial feedback goals of convincing developers that a usability problem exists and conveying an understanding of the problem.
