Crossmodal congruence: the look, feel and sound of touchscreen widgets

Our research considers the following question: how can visual, audio and tactile feedback be combined congruently for use with touchscreen graphical widgets? For example, if a touchscreen display presents different styles of visual buttons, what should each of those buttons feel and sound like? This paper presents the results of an experiment investigating how visual feedback can be congruently combined with audio/tactile feedback by manipulating the parameters of each modality. The results indicate trends in which individual visual parameters, such as shape, size and height, combine congruently with audio/tactile parameters such as texture, duration and actuator technology. We draw further on the experimental results, using individual quality ratings to evaluate the perceived quality of our touchscreen buttons, and reveal a correlation between perceived quality and crossmodal congruence. These results will enable mobile touchscreen UI designers to create realistic, congruent buttons by selecting the most appropriate audio and tactile counterparts for each visual button style.
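
As a minimal sketch of how a designer might apply such pairings, the lookup table below maps visual button styles to audio/tactile counterparts. All parameter names and values here are illustrative assumptions, not the congruent pairings measured in the experiment; a real table would be populated from the study's congruence ratings.

```python
from dataclasses import dataclass

# Hypothetical parameter sets for illustration only; the congruent
# pairings reported in the paper would replace these example values.

@dataclass(frozen=True)
class VisualStyle:
    shape: str    # e.g. "rounded", "rectangular"
    size: str     # e.g. "small", "large"
    height: str   # e.g. "flat", "raised"

@dataclass(frozen=True)
class AudioTactileStyle:
    texture: str      # e.g. "smooth", "rough"
    duration_ms: int  # feedback pulse duration
    actuator: str     # e.g. "piezoelectric", "vibration motor"

# Illustrative mapping from visual button styles to their
# audio/tactile counterparts (frozen dataclasses are hashable,
# so they can serve directly as dictionary keys).
CONGRUENT_FEEDBACK: dict[VisualStyle, AudioTactileStyle] = {
    VisualStyle("rounded", "large", "raised"):
        AudioTactileStyle("smooth", 30, "piezoelectric"),
    VisualStyle("rectangular", "small", "flat"):
        AudioTactileStyle("rough", 15, "vibration motor"),
}

def feedback_for(style: VisualStyle) -> AudioTactileStyle | None:
    """Return the congruent audio/tactile style for a visual style, if known."""
    return CONGRUENT_FEEDBACK.get(style)
```

For instance, feedback_for(VisualStyle("rounded", "large", "raised")) returns the smooth, short piezoelectric feedback paired with that style in this hypothetical table.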
