The Relative Importance of Visual, Auditory, and Haptic Information for the User's Experience of Mechanical Switches

While the use of hand tools and other everyday manually controlled devices is naturally accompanied by multisensory feedback, the deployment of fully multimodal virtual interfaces requires that haptic, acoustic, and visual cues be synthesised. The complexity and character of this synthesis depend on a thorough understanding of the multimodal perceptual experience, including the interrelations between the individual sensory channels during manual interaction. In this study, seventy participants were asked to rank the manual operation of ten electromechanical switches according to preference. The participants were randomly assigned, in groups of ten, to one of seven sensory presentation conditions. These conditions comprised six bimodal and unimodal sensory combinations created by selectively restricting the flow of haptic, auditory, and visual information, plus one condition in which full sensory information was available. A principal components analysis of the obtained ranking data indicated that the sensory conditions with unimpeded haptic information were clearly distinct from those in which the haptic cues were impeded. The analysis also showed that, for switch use, the unimodal haptic condition most closely approached the condition with combined haptic, auditory, and visual feedback, compared with all of the conditions in which haptic feedback was restricted.
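As a minimal sketch of the kind of analysis described above, the following Python snippet (assuming NumPy and scikit-learn; the condition labels and rank values are placeholders, not the study's data) projects per-condition rank profiles onto their first two principal components, so that conditions whose switch rankings agree plot close together.

```python
# Illustrative sketch only: a principal components analysis on mean preference
# ranks, one row per sensory condition and one column per switch, to see which
# conditions cluster together. The data below are random placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

conditions = ["HAV", "HA", "HV", "AV", "H", "A", "V"]  # hypothetical labels
n_switches = 10

# Hypothetical mean ranks (rows: conditions, columns: switches); each row is a
# random permutation of 1..10 standing in for the average rank (1 = most
# preferred) given by that condition's ten raters.
mean_ranks = np.argsort(rng.random((len(conditions), n_switches)), axis=1) + 1.0

# Centre each switch's column and project the conditions onto the first two
# principal components; conditions with similar rank profiles land near each
# other in this plane.
pca = PCA(n_components=2)
scores = pca.fit_transform(mean_ranks - mean_ranks.mean(axis=0))

for label, (pc1, pc2) in zip(conditions, scores):
    print(f"{label:>3}: PC1 = {pc1:6.2f}, PC2 = {pc2:6.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_)
```

In the study itself, the separation of interest was between conditions with unimpeded haptic information and those with restricted haptic cues; in a plot of the component scores such a split would appear as two distinct groups of condition points.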
