VRsneaky: Increasing Presence in VR Through Gait-Aware Auditory Feedback

While Virtual Reality continues to increase in fidelity, it remains an open question how to effectively reflect the user's movements and provide congruent feedback in virtual environments. We present VRsneaky, a system that provides auditory movement feedback, helping users orient themselves in a virtual environment through footstep sounds. The system reacts to the user's specific gait features and adjusts the audio accordingly. In a user study with 28 participants, we found that VRsneaky increases users' sense of presence as well as their awareness of their own posture and gait. We also found that increasing auditory realism significantly influences certain characteristics of participants' gait. Our work shows that gait-aware audio feedback is a means to increase presence in virtual environments. We discuss opportunities and design requirements for future scenarios where users walk through immersive virtual worlds.
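The abstract does not specify how the system detects and sonifies steps; as a rough illustration of the kind of gait-aware triggering it describes, the sketch below detects heel strikes from a tracked foot's height and scales a footstep sound by strike vigor. All names (`FootState`, `detect_strike`, `play_footstep`) and thresholds are hypothetical assumptions, not the authors' implementation.

```python
# Illustrative sketch only (assumed names, not the authors' implementation):
# detect heel strikes from a tracked foot's height and trigger a footstep
# sound whose volume scales with how hard the foot comes down.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class FootState:
    prev_height: float = 0.0   # foot height (m) sampled in the previous frame
    airborne: bool = False     # has the foot lifted since the last strike?

def detect_strike(state: FootState, height: float,
                  lift_thresh: float = 0.05,
                  strike_thresh: float = 0.02) -> Optional[float]:
    """Return a step intensity in [0, 1] when a heel strike is detected, else None."""
    intensity = None
    if height > lift_thresh:
        state.airborne = True
    elif state.airborne and height < strike_thresh:
        # Foot returned to the ground: use the per-frame downward displacement
        # as a rough proxy for strike vigor.
        drop = max(state.prev_height - height, 0.0)
        intensity = min(drop / lift_thresh, 1.0)
        state.airborne = False
    state.prev_height = height
    return intensity

def on_tracking_frame(state: FootState, foot_height: float,
                      play_footstep: Callable[[float], None]) -> None:
    """Call once per tracking frame; plays a gait-scaled footstep on heel strike."""
    intensity = detect_strike(state, foot_height)
    if intensity is not None:
        play_footstep(0.3 + 0.7 * intensity)  # louder for more forceful steps
```

In practice such a detector would run per foot on tracker data each frame, with `play_footstep` bound to whatever audio engine the application uses.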
