What do users really care about? A comparison of usability problems found by users and experts on highly interactive websites

Expert evaluation methods, such as heuristic evaluation, remain popular in spite of numerous criticisms of their effectiveness. This paper investigates the usability problems found in the evaluation of six highly interactive websites by 30 users in a task-based evaluation and by 14 experts using three different expert evaluation methods. A grounded theory approach was taken to categorize the 935 usability problems identified. Four major categories emerged: Physical Presentation, Content, Information Architecture, and Interactivity, each with between 5 and 16 sub-categories. The categories and sub-categories were then analysed according to whether they were found by users only, by experts only, or by both users and experts. This allowed us to develop an evidence-based set of 21 heuristics to assist in the development and evaluation of interactive websites.