Interaction between the listener and their environment in a spatial auditory display plays an important role in creating better situational awareness, resolving front/back and up/down confusions, and improving localization. Prior studies with 6DOF interaction suggest that a head tracker and a mouse-driven interface yield similar performance during a navigation and search task in a virtual auditory environment. In this paper, we present a study that compares listener performance in a virtual auditory environment under a static condition and two dynamic conditions (head tracker and mouse), both using orientation-only interaction. Results reveal tradeoffs among the conditions and interfaces: while the fastest response times were observed in the static condition, both dynamic conditions significantly reduced front/back confusions and improved localization accuracy. Training effects and search strategies are discussed.