Psychoacoustic Evaluation of Systems for Delivering Spatialized Augmented-Reality Audio

Two new lightweight systems for delivering spatialized augmented-reality audio (SARA) are presented. Each comprises a set of earphone drivers coupled with "acoustically transparent" earpieces and a digital filter. With the first system, subjects localized virtual auditory space (VAS) stimuli as accurately as with the earphones conventionally used for VAS presentation, while free-field localization performance was only slightly reduced. Its only disadvantage is a poor low-frequency response. VAS localization with the second system is likewise as accurate as with standard VAS presentation earphones, though free-field localization is degraded to a greater extent. This system has a good low-frequency response, however, so its range of uses complements that of the first SARA system. Both systems are light and easily constructed from unmodified, commercially available products. They require little digital signal processing overhead and no special preamplifier, so they are ideally suited to mobile applications.
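The rendering chain implied by this description (a binaural filter per ear followed by a compensation filter for the driver/earpiece response) can be illustrated with a minimal convolution sketch. The code below is not the authors' implementation; the function name render_vas, its parameters, and the placeholder filter data are hypothetical, and a real system would use measured head-related impulse responses (HRIRs) and an inverse filter derived from the measured earpiece response.

```python
import numpy as np
from scipy.signal import fftconvolve

def render_vas(mono, hrir_left, hrir_right, eq_filter):
    """Spatialize a mono signal and equalize it for an open, 'acoustically
    transparent' earpiece (hypothetical sketch, not the paper's code)."""
    left = fftconvolve(mono, hrir_left)    # binaural filtering, left ear
    right = fftconvolve(mono, hrir_right)  # binaural filtering, right ear
    left = fftconvolve(left, eq_filter)    # compensate driver/earpiece response
    right = fftconvolve(right, eq_filter)
    return np.stack([left, right])         # shape: (2, n_samples)

# Placeholder data: real HRIRs and the equalization filter would be measured.
fs = 48000
source = np.random.randn(fs)               # 1 s of noise as a stand-in source
hrir_l = 0.05 * np.random.randn(256)       # hypothetical HRIR pair
hrir_r = 0.05 * np.random.randn(256)
eq = np.zeros(128)
eq[0] = 1.0                                # identity "equalizer" placeholder
binaural = render_vas(source, hrir_l, hrir_r, eq)
```

Because the filtering reduces to two short convolutions per ear, the processing cost is modest, which is consistent with the abstract's claim of low digital signal processing overhead for mobile use.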
