The Roles of Spatial Auditory Perception and Cognition in the Accessibility of a Game Map with a First Person View

The work presented here is the first part of a broader research effort on auditory navigation. It focuses particularly on navigation with a first-person view, and aims to highlight and study critical usability aspects of 3-D audio technology in this context. An experiment was conducted in which forty subjects had to find nine sound sources in a virtual town, navigating by means of spatialized auditory cues delivered under four conditions: binaural versus stereophonic rendering (both through headphones), combined with contextualized versus decontextualized beacons. A decontextualized beacon uses a sound indicating the azimuth of a target, whereas a contextualized beacon uses a sound indicating the shortest path toward the target. Behavioral data, self-assessments of cognitive load, and subjective impressions collected via a questionnaire were recorded. As expected, using binaural rendering or contextualized beacons improved the orientation task by enhancing dynamic localization performance and correspondingly reduced the player's workload. However, contextualized beacons (with either binaural or stereophonic rendering) were not as beneficial as expected for navigation itself, failing to reduce reliance on the physical space.
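To make the beacon distinction concrete, the following sketch contrasts the two cue directions: a decontextualized beacon points along the straight line to the target, while a contextualized beacon points along the first step of the shortest walkable path. The grid world, BFS pathfinding, and all function names here are illustrative assumptions, not the paper's actual implementation.

```python
import math
from collections import deque

def azimuth_beacon(player, target):
    """Decontextualized beacon: azimuth (radians) of the straight
    line from the player to the target, ignoring obstacles."""
    dx, dy = target[0] - player[0], target[1] - player[1]
    return math.atan2(dy, dx)

def path_beacon(player, target, walls, size):
    """Contextualized beacon: azimuth of the first step along the
    shortest walkable path (BFS on a small grid) to the target."""
    if player == target:
        return None
    prev = {player: None}
    queue = deque([player])
    while queue:
        x, y = queue.popleft()
        if (x, y) == target:
            break
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in walls and nxt not in prev):
                prev[nxt] = (x, y)
                queue.append(nxt)
    if target not in prev:
        return None  # target unreachable
    # Walk back from the target to recover the player's first step.
    step = target
    while prev[step] != player:
        step = prev[step]
    return azimuth_beacon(player, step)

# A wall between player and target makes the two beacons disagree:
# the azimuth beacon points straight through the wall, while the
# path beacon points along the detour around it.
walls = {(1, 0), (1, 1), (1, 2)}
print(azimuth_beacon((0, 1), (2, 1)))          # 0.0 (straight at the target)
print(path_beacon((0, 1), (2, 1), walls, 4))   # pi/2 (detour around the wall)
```

In the town used in the experiment, where buildings block direct lines of sight, the two cues can point in very different directions, which is precisely the hypothesized advantage of the contextualized beacon.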