Sonic Interaction Design to enhance presence and motion in virtual environments

A recurring problem with image-based rendering technology for virtual environments has been that subjects generally show very little head and body movement, since only visual stimuli are provided. Drawing on insights from film studies and current practice, practitioners emphasize that auditory feedback, such as the sound of footsteps, gives a character weight and leads the audience to perceive the character as embodied. We hypothesize that the rate of movement can be significantly enhanced by introducing auditory feedback. In the study described here, 126 subjects participated in a between-subjects experiment with six experimental conditions, including both unimodal and bimodal stimuli (auditory and visual). The aim of the study was to investigate the influence of auditory rendering in stimulating and enhancing subjects' motion in virtual reality. The auditory stimuli consisted of several combinations of auditory feedback, including static sound sources as well as self-induced sounds. Results show that subjects' motion in virtual reality is significantly enhanced when dynamic sound sources and the sound of egomotion are rendered in the environment.