Keep talking: an analysis of participant utterances gathered using two concurrent think-aloud methods

This paper presents the results of a study comparing the effects of two concurrent think-aloud styles, the classic approach and a relaxed, interactive approach, on the nature and number of participant utterances produced. Overall, ten categories of utterance were extracted from the verbal data, ranging from categories that had a direct impact on usability problem analysis to those that simply described procedural actions. No category of utterance was unique to either method. The interactive think-aloud led to the production of more utterances that could be used directly in usability problem analysis. Participants also provided explanations, opinions and recommendations during the classic think-aloud, even though they were not instructed to do so. This finding suggests that the social context of testing may override the classic instruction to think aloud.
