Preliminary Experiment Combining Virtual Reality Haptic Shoes and Audio Synthesis

We describe a system that provides combined auditory and haptic sensations to simulate walking on different grounds. It uses a physical model that drives both haptic transducers embedded in sandals and headphones. The model represents walking interactions with solid surfaces that can creak or that are covered with crumpling material. In a preliminary discrimination experiment, 15 participants were asked to recognize four surfaces from a list of sixteen possibilities under three conditions: haptics only, audition only, and combined audio-haptics. The results indicate that subjects are able to recognize most of the stimuli in the audition-only condition, and some material properties, such as hardness, in the haptics-only condition. Combining auditory and haptic cues did not significantly improve recognition.
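The paper does not give the synthesis equations, but physically informed models of crumpling grounds are commonly built as stochastic sums of micro-impacts: each grain is a damped oscillation, and grain onsets follow a Poisson process whose rate reflects how densely the material fractures under the foot. The sketch below is purely illustrative of that family of models (the function names, parameter ranges, and distributions are assumptions, not the authors' implementation):

```python
import math
import random

SR = 44100  # sample rate in Hz (assumed)

def micro_impact(n, freq, decay, amp, sr=SR):
    """One damped sinusoid modelling a single micro-impact (grain)."""
    return [amp * math.exp(-decay * i / sr) * math.sin(2 * math.pi * freq * i / sr)
            for i in range(n)]

def crumple(duration=0.5, density=80.0, sr=SR, seed=0):
    """Stochastic sum of micro-impacts approximating a crumpling ground layer.

    density: mean number of micro-impacts per second (Poisson rate).
    Higher grain frequencies and faster decays would suggest a stiffer,
    harder material; the ranges below are illustrative only.
    """
    rng = random.Random(seed)
    out = [0.0] * int(duration * sr)
    t = 0.0
    while True:
        t += rng.expovariate(density)  # Poisson-spaced impact onsets
        if t >= duration:
            break
        start = int(t * sr)
        grain = micro_impact(n=int(0.01 * sr),            # 10 ms grains
                             freq=rng.uniform(800, 4000),
                             decay=rng.uniform(200, 600),
                             amp=rng.uniform(0.1, 1.0))
        for i, s in enumerate(grain):
            if start + i < len(out):
                out[start + i] += s
    return out

signal = crumple()
```

In a system like the one described, the same control signal (e.g. foot pressure from the sandals) could drive both this audio stream and a low-pass-filtered copy sent to the haptic transducers, keeping the two modalities coherent.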
