Multimodal Interaction: Real Context Studies on Mobile Digital Artefacts

The way users interact with mobile applications varies with the context in which they find themselves. We conducted a study in which users manipulated a multimodal questionnaire in four different contexts (home, park, subway, and driving), considering several variables that affect interaction (lighting, noise, position, movement, type of content, number of people surrounding the user, and time constraints). The study aimed to understand the effect of these context variables on users' choices among the available interaction modalities (voice, gestures, etc.). We describe the results of the study, identifying situations in which users adopted specific modalities and the reasons for doing so. Accordingly, we draw conclusions about users' modality preferences in real-life contexts.
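As a minimal sketch (not taken from the paper), the record below shows one way the study's observations could be encoded: each entry pairs a usage context and the situational variables named in the abstract with the modality the participant chose. The class and field names, the example values, and the inclusion of a "touch" modality are assumptions for illustration only.

```python
# Hypothetical data model for context/modality observations; names and values
# are illustrative, not the authors' instrument.
from dataclasses import dataclass
from enum import Enum


class Context(Enum):
    HOME = "home"
    PARK = "park"
    SUBWAY = "subway"
    DRIVING = "driving"


class Modality(Enum):
    VOICE = "voice"
    GESTURE = "gesture"
    TOUCH = "touch"  # assumed; the abstract lists "voice, gestures, etc."


@dataclass
class Observation:
    context: Context
    lighting: str         # e.g. "bright", "dim"
    noise: str            # e.g. "quiet", "loud"
    position: str         # e.g. "sitting", "standing"
    moving: bool
    content_type: str     # e.g. "text entry", "multiple choice"
    people_nearby: int
    time_pressure: bool
    chosen_modality: Modality


def modality_counts(observations):
    """Tally how often each modality was chosen in each context."""
    counts = {}
    for obs in observations:
        key = (obs.context, obs.chosen_modality)
        counts[key] = counts.get(key, 0) + 1
    return counts


if __name__ == "__main__":
    sample = [
        Observation(Context.SUBWAY, "dim", "loud", "standing", True,
                    "multiple choice", 12, True, Modality.TOUCH),
        Observation(Context.HOME, "bright", "quiet", "sitting", False,
                    "text entry", 0, False, Modality.VOICE),
    ]
    print(modality_counts(sample))
```

Aggregating choices per context in this way is one plausible route to the kind of modality-preference conclusions the abstract describes.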
