Assessment of Binaural–Proprioceptive Interaction in Human-Machine Interfaces

Binaural models help predict human sound localization under the assumption that the localization process is based on acoustic signals alone, that is, on unimodal information. But what happens when localization takes place in an environment offering bimodal or even multimodal sensory input? Is the auditory modality still taken into account? Can binaural models help predict human localization in bimodal or multimodal scenes? This chapter first addresses binaural-visual localization and demonstrates that binaural models remain essential for modeling human localization even when visual information is available. The main part of the chapter is dedicated to binaural-proprioceptive localization. First, an experiment is described in which proprioceptive localization performance was measured quantitatively. Second, the influence of binaural signals on proprioception was investigated to determine whether synthetically generated spatial sound can improve human proprioceptive localization. The results demonstrate that it is indeed possible to guide proprioception auditorily. In conclusion, binaural models can be used to model not only human binaural-visual but also human binaural-proprioceptive localization. Binaural-modeling algorithms thus play an important role in further technical developments.
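As a minimal illustration of the kind of processing a binaural model performs on acoustic signals, the sketch below estimates the interaural time difference (ITD) between the two ear signals via cross-correlation and maps it to a source azimuth using a simple far-field spherical-head formula. The head radius, sampling rate, and test signal are assumed values for demonstration, not parameters from the chapter; real binaural models operate on band-filtered signals and combine ITD with level and spectral cues.

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference in seconds via cross-correlation.

    Convention: a negative value means the left-ear signal leads,
    i.e., the source is on the listener's left.
    """
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # lag in samples
    return lag / fs

def itd_to_azimuth(itd, head_radius=0.0875, c=343.0):
    """Map an ITD to an azimuth (degrees) with a simple far-field model.

    Assumes itd = (2 * head_radius / c) * sin(azimuth); head_radius
    and the speed of sound c are illustrative defaults.
    """
    x = np.clip(itd * c / (2.0 * head_radius), -1.0, 1.0)
    return np.degrees(np.arcsin(x))

# Synthetic check: delay the right channel by a known number of samples,
# simulating a source to the listener's left.
fs = 44100
t = np.arange(0, 0.05, 1.0 / fs)
sig = np.sin(2 * np.pi * 500 * t)
delay = 20  # samples (~0.45 ms)
left = np.pad(sig, (0, delay))
right = np.pad(sig, (delay, 0))

itd = estimate_itd(left, right, fs)       # ≈ -20 / fs seconds
azimuth = itd_to_azimuth(itd)             # negative → source on the left
```

The cross-correlation step mirrors the coincidence-detection idea behind classical place-theory models of binaural hearing; the azimuth mapping is deliberately the crudest usable choice.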
