The Role of 3-D Sound in Human Reaction and Performance in Augmented Reality Environments

The effectiveness of three-dimensional (3-D) sound in virtual reality (VR) environments has been widely studied. However, because VR and augmented reality (AR) systems differ substantially in registration, calibration, perceived immersiveness, navigation, and localization, new approaches are needed to seamlessly register virtual 3-D sound in AR environments, along with studies of 3-D sound's effectiveness in the AR context. In this paper, we design two experimental AR environments to study the effectiveness of 3-D sound both quantitatively and qualitatively. A different tracking method is applied in each experiment to retrieve the 3-D position of virtual sound sources. We examine the impact of 3-D sound on improving depth perception and shortening task completion time. We also investigate its impact on immersive and realistic perception, on the identification of different spatial objects, and on the subjective feeling of "human presence and collaboration". Our studies show that applying 3-D sound is an effective way to complement visual AR environments: it aids depth perception and task performance, and facilitates collaboration between users. Moreover, it enables a more realistic environment and a more immersive feeling of being inside the AR environment through both visual and auditory means. To make full use of the intensity cues provided by 3-D sound, a process that scales the intensity difference of 3-D sound at different depths is designed to accommodate small AR environments. The user study results show that the scaled 3-D sound significantly increases the accuracy of depth judgments and shortens search-task completion time. This method provides a necessary foundation for implementing 3-D sound in small AR environments. Our user study results also show that this process does not degrade the intuitiveness or realism of an augmented audio reality environment.
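The abstract does not specify how the intensity difference is scaled. A minimal sketch of the general idea, assuming standard inverse-distance amplitude attenuation exaggerated by a configurable exponent (the function name, reference distance, and exponent are illustrative assumptions, not taken from the paper):

```python
def scaled_gain(distance_m: float,
                ref_distance_m: float = 0.3,
                exponent: float = 2.0,
                min_gain: float = 0.05) -> float:
    """Map a virtual source's distance to an amplitude gain.

    Plain inverse-distance attenuation (exponent = 1.0) yields only
    small level differences across a desktop-scale AR workspace, so
    an exponent > 1.0 (an assumed mechanism) exaggerates the intensity
    difference between near and far depths to make them discriminable.
    """
    if distance_m <= ref_distance_m:
        return 1.0  # sources at or inside the reference distance play at full gain
    gain = (ref_distance_m / distance_m) ** exponent
    return max(gain, min_gain)  # floor keeps distant sources audible
```

With `exponent = 1.0` a source at 0.6 m plays at half amplitude relative to 0.3 m; with `exponent = 2.0` it plays at one quarter, a much more salient depth cue in a small workspace.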
