Multi-Sensory-Motor Research: Investigating Auditory, Visual, and Motor Interaction in Virtual Reality Environments

Perception in natural environments is inseparably linked to motor action. In fact, we consider action an essential component of perceptual representation. Such representations are inherently difficult to investigate, however: traditional experimental setups are limited by their lack of flexibility in manipulating spatial features. Virtual reality (VR) experiments are a feasible alternative, but typical VR setups lack ecological realism because they rely on "unnatural" interface devices such as joysticks. We therefore propose an experimental apparatus that combines multisensory perception and action in an ecologically realistic way. Its basis is a 10-foot hollow sphere (VirtuSphere) placed on a platform that allows free rotation, so that a subject inside can walk in any direction for any distance while immersed in a virtual environment. Both the rotation of the sphere and the movement of the subject's head are tracked to compute the subject's view within the VR environment, which is presented on a head-mounted display. Moreover, auditory features are rendered dynamically, with particular care taken to align sound sources exactly with visual objects, using ambisonic-encoded audio processed through an HRTF filterbank. We present empirical data that confirm the ecological realism of this setup and discuss its suitability for multi-sensory-motor research.
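
To make the tracking chain concrete, the following is a minimal sketch of how the tracked sphere rotation and head pose could be combined into the rendered viewpoint. The function names, coordinate conventions, update rate, and the rolling-contact model are illustrative assumptions, not the apparatus's actual implementation.

```python
# Illustrative sketch (not the actual system code): deriving virtual
# locomotion from tracked sphere rotation and composing it with the
# tracked head pose. Names and conventions are assumptions.
import numpy as np

SPHERE_RADIUS_M = 1.52  # 10-foot diameter sphere => ~1.52 m radius

def walking_velocity(omega):
    """Subject's walking velocity from the sphere's angular velocity
    `omega` (rad/s, world frame). The subject stays at the bottom
    contact point, so their velocity relative to the sphere surface is
    the negative of the surface velocity there: v = -(omega x r), with
    r pointing from the sphere center to the contact point."""
    r = np.array([0.0, 0.0, -SPHERE_RADIUS_M])
    return -np.cross(omega, r)

def camera_pose(position, head_rotation, head_offset):
    """Compose the integrated walking position with the tracked head
    rotation (3x3 matrix) and an eye-height offset into a view pose."""
    return position + head_offset, head_rotation

# Per-frame update: integrate walking velocity, then apply head tracking.
dt = 1.0 / 60.0                      # assumed update rate
omega = np.array([0.0, 0.8, 0.0])    # sphere spinning about the world Y axis
position = np.zeros(3)
position += walking_velocity(omega) * dt
eye, view_rot = camera_pose(position, np.eye(3), np.array([0.0, 0.0, 1.7]))
```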
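
Similarly, here is a minimal sketch of the audio chain the abstract names (ambisonic encoding, head-tracking rotation of the sound field, rendering through an HRTF filterbank), assuming first-order B-format, a horizontal four-speaker virtual layout, and toy placeholder HRIRs. A real system would use measured HRTF impulse responses and possibly a higher ambisonic order.

```python
# Sketch of the ambisonic -> HRTF binaural rendering stage. All names,
# the first-order choice, and the placeholder HRIRs are assumptions.
import numpy as np

FS = 44100  # sample rate (Hz), assumed

def encode_first_order(mono, az, el):
    """Encode a mono source at (azimuth, elevation) in radians into
    first-order B-format channels (W, X, Y, Z)."""
    w = mono * (1.0 / np.sqrt(2.0))
    x = mono * np.cos(az) * np.cos(el)
    y = mono * np.sin(az) * np.cos(el)
    z = mono * np.sin(el)
    return np.stack([w, x, y, z])

def rotate_yaw(bformat, yaw):
    """Counter-rotate the sound field by the tracked head yaw so that
    sources stay world-fixed while the listener turns."""
    w, x, y, z = bformat
    c, s = np.cos(yaw), np.sin(yaw)
    return np.stack([w, c * x + s * y, -s * x + c * y, z])

def decode_to_speakers(bformat, speaker_az):
    """Simple sampling decoder onto a horizontal ring (Z omitted)."""
    w, x, y, _z = bformat
    return [0.5 * (np.sqrt(2.0) * w + np.cos(az) * x + np.sin(az) * y)
            for az in speaker_az]

def placeholder_hrir(az, n=64):
    """Toy HRIR pair (interaural delay + level difference) standing in
    for measured HRTFs; positive azimuth means source to the left."""
    itd = int(round(np.sin(az) * 0.0007 * FS))  # up to ~0.7 ms delay
    left, right = np.zeros(n), np.zeros(n)
    left[max(0, -itd)] = 1.0 + 0.3 * np.sin(az)
    right[max(0, itd)] = 1.0 - 0.3 * np.sin(az)
    return left, right

def binauralize(mono, src_az, head_yaw):
    """Full chain: encode -> head rotation -> virtual speakers -> HRIRs."""
    speakers = np.deg2rad([0.0, 90.0, 180.0, 270.0])  # assumed layout
    b = rotate_yaw(encode_first_order(mono, src_az, 0.0), head_yaw)
    out_l = out_r = 0.0
    for feed, az in zip(decode_to_speakers(b, speakers), speakers):
        hl, hr = placeholder_hrir(az)
        out_l = out_l + np.convolve(feed, hl)
        out_r = out_r + np.convolve(feed, hr)
    return np.stack([out_l, out_r])

# Example: a 1 kHz tone 45 degrees to the left remains world-fixed
# while the tracked head turns 20 degrees.
t = np.arange(FS) / FS
stereo = binauralize(np.sin(2 * np.pi * 1000 * t),
                     np.deg2rad(45.0), np.deg2rad(20.0))
```

Rotating the B-format field by the tracked head yaw before decoding, rather than re-encoding every source, is what keeps sound sources aligned with their visual counterparts at low per-frame cost.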