Emergent effects in multimodal feedback from virtual buttons

The continued advancement of computer interfaces that support 3D tasks requires a better understanding of how users interact with 3D user interfaces in a virtual workspace. This article presents two studies that investigated how the visual, auditory, and haptic feedback modalities presented by a virtual button in a 3D environment affect task performance (time on task and task errors) and user ratings. Although we expected task performance to improve when two or three feedback modalities were combined rather than presented singly, we instead found a significant emergent behavior that decreased performance in the trimodal condition. We found a significant increase in the number of presses in which the user released the button before the virtual switch closed, suggesting that the combined visual, auditory, and haptic feedback led participants to believe prematurely that they had actuated the button. This suggests that, in the design of virtual buttons, considering the effect of each feedback modality independently is not sufficient to predict performance, and unexpected effects may emerge when feedback modalities are combined.
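
To make the premature-release error concrete, the following minimal sketch shows one way a virtual button's logger might distinguish completed actuations from presses released before the switch closes. The thresholds (CONTACT_DEPTH_MM, CLOSURE_DEPTH_MM) and all names are illustrative assumptions, not the apparatus or analysis used in the studies.

from dataclasses import dataclass

# Hypothetical thresholds (placeholders, not values from the studies).
CONTACT_DEPTH_MM = 0.5   # finger travel at which a press attempt begins
CLOSURE_DEPTH_MM = 4.0   # finger travel at which the virtual switch "closes"

@dataclass
class ButtonLog:
    presses: int = 0             # press attempts (contact made)
    actuations: int = 0          # presses that reached the closure depth
    premature_releases: int = 0  # released before the switch closed

def process_press(depth_samples_mm, log: ButtonLog) -> None:
    """Classify one press attempt from a sequence of penetration depths (mm).

    An attempt begins once travel exceeds CONTACT_DEPTH_MM. If the maximum
    depth never reaches CLOSURE_DEPTH_MM before release, the attempt is
    counted as a premature release, the error pattern described above.
    """
    if max(depth_samples_mm, default=0.0) < CONTACT_DEPTH_MM:
        return  # no contact made, so not a press attempt
    log.presses += 1
    if max(depth_samples_mm) >= CLOSURE_DEPTH_MM:
        log.actuations += 1
    else:
        log.premature_releases += 1

# Example: three attempts, the second released before closure.
log = ButtonLog()
for attempt in ([0.6, 2.1, 4.3, 1.0], [0.7, 1.8, 2.9, 0.4], [0.5, 3.0, 4.5, 0.2]):
    process_press(attempt, log)
print(log)  # ButtonLog(presses=3, actuations=2, premature_releases=1)

Under this kind of counting, feedback delivered before the closure depth is reached would inflate premature_releases relative to actuations, which is the emergent effect reported for the trimodal condition.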
