Visual-Auditory Redirection: Multimodal Integration of Incongruent Visual and Auditory Cues for Redirected Walking

In this paper, we present a redirected walking (RDW) study that shifts the positional relationship between visual and auditory cues during curvature manipulation. Previous work has shown that, when presented with incongruent visual and auditory spatial cues in a localization task, human observers integrate the two sources of information according to each cue's relative reliability, and this weighted combination determines where they perceive the target object. This multimodal integration model is known as maximum likelihood estimation (MLE). By using auditory cues to shift the perceived locations of visual objects in virtual reality (VR) during redirection, we expect fewer users to notice the manipulation, which in turn increases the usable curvature gain. Most existing studies of MLE-based multimodal integration have used random-dot stereograms as visual cues while observers remained stationary. In the present study, we first investigated whether this model holds while walking in a VR environment. Our results indicate that, in a walking state, users' perceived target locations shift toward the auditory cue as the reliability of vision decreases, consistent with the trend reported in previous MLE studies. Building on this result, we then measured the detection threshold of curvature gains during redirection under two conditions: one with congruent visual-auditory cues, and one in which an incongruent auditory cue was expected to bias users' perceived target locations. We found that the detection threshold of curvature gains was higher with incongruent visual-auditory cues than with congruent cues. These results suggest that incongruent multimodal cues are a promising tool for redirected walking in VR.
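For context, the MLE model invoked above has a standard formulation in the cue-integration literature (following Ernst and Banks, 2002, and Alais and Burr, 2004); the notation below is that common convention, not this paper's own. Under MLE, the integrated location estimate is a reliability-weighted average of the unimodal estimates:

\[
\hat{S}_{VA} = w_V \hat{S}_V + w_A \hat{S}_A,
\qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_A^2},
\qquad
w_A = \frac{1/\sigma_A^2}{1/\sigma_V^2 + 1/\sigma_A^2},
\]

where \(\hat{S}_V\) and \(\hat{S}_A\) are the visual and auditory location estimates and \(\sigma_V^2\), \(\sigma_A^2\) their variances (inverse reliabilities). The combined estimate is at least as reliable as either cue alone:

\[
\sigma_{VA}^2 = \frac{\sigma_V^2 \sigma_A^2}{\sigma_V^2 + \sigma_A^2} \le \min\!\left(\sigma_V^2, \sigma_A^2\right).
\]

As a worked example, if visual noise doubles so that \(\sigma_V = 2\sigma_A\), the auditory weight becomes \(w_A = 0.8\): the percept shifts strongly toward the auditory cue, which is exactly the bias the curvature-gain manipulation exploits when visual reliability is degraded.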
