Personas in Heuristic Evaluation: An Exploratory Study

Research problem: No study has explored how incorporating personas into the heuristic evaluation of products, namely websites, affects the kinds of findings reported and the recommendations presented by usability evaluators.

Research questions: (1) Do findings from heuristic evaluations of a website conducted without personas differ from findings from heuristic evaluations of the same website conducted with personas? (2) Do findings from persona-based heuristic evaluations in which evaluators develop their own personas differ from findings from persona-based heuristic evaluations in which evaluators are given personas? (3) If the findings and recommendations differ, how do they differ? (4) How does the use of personas affect evaluators' confidence in the findings of a heuristic evaluation?

Literature review: First, previous research on heuristic evaluation has concluded that although the method is inexpensive and does not require advance planning, it has several shortcomings, including its disproportionate focus on minor issues and its inability to capture all usability problems. Second, data-driven personas, which have long been a resource in user-centered design, have been suggested as a way to improve or enhance heuristic evaluation, and several studies suggest that usability professionals are indeed using personas in their evaluations. However, no empirical study has assessed heuristic evaluations that include personas.

Methodology: In this exploratory study involving three sections of an advanced technical writing course, groups of evaluators conducted a heuristic evaluation of a website. Each section was randomly assigned one of three conditions under which it would conduct the evaluation: (a) a traditional heuristic evaluation, (b) a persona-led heuristic evaluation in which the personas were given to the evaluators, or (c) a persona-led heuristic evaluation in which the evaluators created their own personas. Each group wrote a report identifying the major problems with the website and recommending solutions to those problems. The evaluators also completed a pretest demographic survey and a posttest confidence survey.

Results and discussion: This exploratory study found few detectable differences between the findings reported by groups that used personas in heuristic evaluation and those reported by groups that did not. The groups that used personas were more likely than the groups that did not to report findings related to navigation, whereas the groups that did not use personas were more likely to report findings related to design. The groups that created their own personas were more likely than the other groups to include complex issues in their reports and to use language that directly references users and user needs. All groups were confident in their findings.
