New approaches to usability evaluation in software development: Barefoot and crowdsourcing

Highlights:
- New approaches to reduce usability evaluation obstacles.
- Obstacles: resource constraints, limited understanding and resistance.
- Barefoot evaluations: reduction of limited understanding and resistance.
- Crowdsourcing evaluations: reduction of resource requirements.

Usability evaluations provide software development teams with insights on the degree to which software applications enable users to achieve their goals, how fast these goals can be achieved, how easy an application is to learn and how satisfactory it is in use. Although such evaluations are crucial in the process of developing software systems with a high level of usability, their use is still limited in small and medium-sized software development companies. Many of these companies are, for example, unable to allocate the resources needed to conduct a full-fledged usability evaluation in accordance with a conventional approach. This paper presents and assesses two new approaches to overcoming usability evaluation obstacles: a barefoot approach, in which software development practitioners are trained to drive usability evaluations; and a crowdsourcing approach, in which end users are given minimalist training to enable them to drive usability evaluations. We have evaluated how these approaches can reduce obstacles related to limited understanding, resistance and resource constraints. We found that the two approaches are complementary and highly relevant for software companies experiencing these obstacles. The barefoot approach is particularly suitable for reducing obstacles related to limited understanding and resistance, while the crowdsourcing approach is cost-effective.
