Sight, hearing and touch are the sensory modalities that play the dominant role in human spatial perception, i.e. the ability to recognize the geometrical structure of the surrounding environment, to remain aware of one's own location in that space, and to determine the depth and direction of nearby objects. Information streams from these senses are continuously integrated and processed in the brain, so that an accurate cognitive representation of the 3D environment can be built whether the observer is stationary or moving. Each of the three senses uses different cues for exploring the environment and features a different perception range (Hall, 1966). Touch provides information about the so-called near space (also termed haptic space), whereas vision and hearing yield percepts representing objects or events in the so-called far space.

Spatial orientation, in the sense of locating scene elements, is the key capability that allows humans to interact with the surrounding environment, e.g. reaching for objects, avoiding obstacles, wayfinding (Golledge, 1999) and determining one's own location with respect to the environment. An important aspect of locating objects in 3D space is the integration of percepts coming from different senses. Depth perception, i.e. the understanding of distance to objects, develops through the concurrent binocular viewing and touching of near-space objects (Millar, 1994). For locating and recognizing far-space objects, vision and hearing cooperate to determine the distance, bearing and type of objects. The field of view of vision is limited to the space in front of the observer, whereas hearing is omnidirectional and sound sources can be located even when occluded by other objects.

Correct reproduction of sensory stimuli is important in virtual reality systems, in which 3D vision-based technologies are predominantly employed for creating immersive artificial environments. Many applications can greatly benefit from synthesized acoustic 3D spaces (e.g. operators of complex control panels, or in-field communication among soldiers in combat or firefighters). If such spaces are appropriately synthesized, perception capacity and immersion in the environment can be considerably enhanced (Castro, 2006). It has also been demonstrated that if spatial rather than monophonic sounds are applied, the reaction time to acoustic stimuli becomes shorter and the listener is less prone to fatigue (Moore, 2004). Because of the enriched acoustic experience such devices offer (e.g. spaciousness and interactivity), they are frequently termed auditory display systems. Recently, such systems have also been gaining in importance.
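To make the notion of synthesizing an acoustic 3D space more concrete, the sketch below spatializes a monophonic signal using two classic binaural cues: an interaural time difference (ITD) approximated by Woodworth's spherical-head formula, and a crude, frequency-independent interaural level difference (ILD). This is a minimal illustration only, not the method of any system cited here; the head radius, the 6 dB ILD ceiling and the function names are assumptions, and practical auditory displays render sources with measured head-related transfer functions (HRTFs) instead.

```python
# Minimal binaural spatialization sketch (hypothetical illustration).
# Places a mono sound at a given azimuth using two classic cues:
# ITD via Woodworth's spherical-head formula, and a simple ILD.
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, average adult head (assumption)

def spatialize(mono: np.ndarray, fs: int, azimuth_deg: float) -> np.ndarray:
    """Return an (N, 2) stereo signal carrying ITD/ILD cues for the azimuth.

    Positive azimuth = source to the listener's right.
    """
    theta = np.radians(azimuth_deg)
    # Woodworth ITD model: itd = (a / c) * (theta + sin(theta))
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + np.sin(theta))
    delay_samples = int(round(abs(itd) * fs))

    # Crude frequency-independent ILD: attenuate the far ear by up to ~6 dB.
    far_gain = 10 ** (-6.0 * abs(np.sin(theta)) / 20.0)

    delayed = np.concatenate([np.zeros(delay_samples), mono])
    direct = np.concatenate([mono, np.zeros(delay_samples)])

    if azimuth_deg >= 0:   # source on the right: left ear lags and is quieter
        left, right = far_gain * delayed, direct
    else:                  # source on the left: right ear lags and is quieter
        left, right = direct, far_gain * delayed
    return np.stack([left, right], axis=1)

if __name__ == "__main__":
    fs = 44100
    t = np.arange(fs) / fs                    # 1 s test tone
    tone = 0.5 * np.sin(2 * np.pi * 440 * t)
    stereo = spatialize(tone, fs, azimuth_deg=60.0)
    print(stereo.shape)                       # (44100 + delay, 2)
```

Even this simplified model conveys left-right direction; resolving elevation and front-back ambiguities requires the spectral cues that measured HRTFs provide, which is why personalized HRTF measurement (as in reference [21]) matters for auditory displays.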
[1] P. Strumillo et al., "Obstacle localization in 3D scenes from stereoscopic sequences," 2007 15th European Signal Processing Conference, 2007.
[2] B. C. J. Moore, An Introduction to the Psychology of Hearing, 1997.
[3] D. Burschka et al., "Advances in Computational Stereo," IEEE Trans. Pattern Anal. Mach. Intell., 2003.
[4] L. Kay et al., "A sonar aid to enhance spatial perception of the blind: engineering design and evaluation," 1974.
[5] R. S. Choraś, Image Processing and Communications Challenges 4 - Proceedings of the 4th International Conference, IP&C 2012, 2013.
[6] P. Strumillo et al., "Refinement of depth from stereo camera ego-motion parameters," 2008.
[7] M. Kato et al., "The effect of head motion on the accuracy of sound localization," 2003.
[8] R. Nicolson, review of A. S. Bregman, Auditory Scene Analysis: The Perceptual Organization of Sound (MIT Press, 1990), 1991.
[9] R. Golledge, Wayfinding Behavior: Cognitive Mapping and Other Spatial Processes, 1999.
[10] N. Bourbakis et al., "Sensing Surrounding 3-D Space for Navigation of the Blind," IEEE Engineering in Medicine and Biology Magazine, 2008.
[11] M. Capp et al., "The optophone: an electronic blind aid," 2000.
[12] N. A. Bradley et al., Assistive Technology for Visually Impaired and Blind People, 2008.
[13] P. B. L. Meijer, "An experimental system for auditory image representations," IEEE Transactions on Biomedical Engineering, 1992.
[14] A. D. Heyes, "The Sonic Pathfinder: A New Electronic Travel Aid," 1984.
[15] J. D. Miller, "SLAB: A Software-Based Real-Time Virtual Acoustic Environment Rendering System," 2001.
[16] M. Faruque et al., "Electronic Travel Aid for the Blind," 2001.
[17] A. S. Bregman, Auditory Scene Analysis: The Perceptual Organization of Sound, MIT Press, 1990.
[18] J. Mira et al., Engineering Applications of Bio-Inspired Artificial Neural Networks, Lecture Notes in Computer Science, 1999.
[19] L. Kay, "An ultrasonic sensing probe as a mobility aid for the blind," 1964.
[20] E. T. Hall, The Hidden Dimension, 1966.
[21] P. Strumillo et al., "Measurement System for Personalized Head-Related Transfer Functions and Its Verification by Virtual Source Localization Trials with Visually Impaired and Sighted Individuals," 2010.