Implicit Estimation of Sound-arrival Time

In perceiving the sound produced by the movement of a visible object, the brain coordinates the auditory and visual inputs1–3 so that no delay is noticed even though the sound arrives later (this compensation is less effective for distant sources, such as aircraft or firework displays). Here we show that this coordination occurs because the brain uses distance information supplied by the visual system to calibrate simultaneity. Our findings indicate that auditory and visual inputs are coordinated not because the brain has a wide temporal window for audiovisual integration, as was previously thought, but because the brain actively shifts the temporal location of this window according to the distance of the visible sound source.

Seven subjects with normal vision and hearing were presented through headphones with a burst of white noise (90 decibels sound-pressure level, 10-ms duration, with 4-ms rise and fall times), the spectrum of which had been processed (using head-related transfer functions) to simulate an external sound from a frontal direction. Brief light flashes (10 ms) were produced by an array of five green light-emitting diodes (LEDs) placed at different distances from the subjects (1–50 m; Fig. 1). The luminance of the light flash was 14.5 candelas per square metre at a viewing distance of 1 m, and was increased in proportion to the square of the viewing distance at the other distances, so that the intensity at the eye remained constant. The difference in onset times between the sound and light stimuli was varied randomly from −125 ms to 175 ms in steps of 25 ms. Subjects were instructed to look at the centre of the LED array and to imagine that the LEDs were the source of both the light and the sound, as if they were hearing the sound directly from that location.
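The physical setup implies two simple calculations: the sound-travel delay that the brain must discount, and the distance-squared luminance scaling just described. A minimal sketch of both (the 343 m/s speed of sound and all function names are our assumptions, not taken from the text):

```python
import math

# Assumed constant: speed of sound in air at ~20 degrees C (not stated in the text).
SPEED_OF_SOUND_M_S = 343.0

def sound_travel_delay_ms(distance_m):
    """Physical lag of sound behind light for a source distance_m metres away."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000.0

def led_luminance_cd_m2(distance_m, base_cd_m2=14.5):
    """Scale the 1-m luminance with the square of the viewing distance
    (inverse-square law), keeping the intensity at the eye constant."""
    return base_cd_m2 * distance_m ** 2

for d in (1, 10, 25, 50):
    print(f"{d:2d} m: sound lags light by {sound_travel_delay_ms(d):6.1f} ms; "
          f"LED set to {led_luminance_cd_m2(d):8.1f} cd/m^2")
```

At 1 m the physical delay is only about 3 ms; at 50 m it grows to roughly 146 ms, far beyond any fixed window of audiovisual integration.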
To eliminate possible bias effects, we used a two-alternative forced-choice task to measure subjective simultaneity: subjects judged whether the light was presented before or after the sound. Twenty responses were obtained for each condition. To determine the stimulus-onset asynchrony corresponding to subjective simultaneity, we estimated the 50% point (the point of subjective equality) by fitting a cumulative normal-distribution function to each subject's data using a maximum-likelihood curve-fitting technique. When the LED array was 1 m away, the point of subjective equality occurred at a sound delay of about 5 ms; however, the sound delay at this point …
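The fitting step above can be reconstructed as follows. This is an illustrative sketch, not the authors' code: the cumulative normal is evaluated with `math.erf`, a coarse grid search stands in for whatever maximum-likelihood optimiser was actually used, and the simulated response counts are invented for demonstration.

```python
import math

def norm_cdf(x, mu, sigma):
    """Cumulative normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def fit_pse(soas_ms, n_light_first, n_trials):
    """Maximum-likelihood fit of a cumulative normal psychometric function.

    Returns (mu, sigma); mu is the 50% point, i.e. the point of
    subjective equality (PSE).  A grid search stands in for a proper
    optimiser, purely for illustration.
    """
    best_params, best_ll = (0.0, 30.0), -math.inf
    for mu in range(-125, 176):              # candidate PSEs, 1-ms steps
        for sigma in range(5, 101, 5):       # candidate slopes
            ll = 0.0
            for x, k, n in zip(soas_ms, n_light_first, n_trials):
                # Binomial log-likelihood; clamp p away from 0 and 1.
                p = min(max(norm_cdf(x, mu, sigma), 1e-9), 1.0 - 1e-9)
                ll += k * math.log(p) + (n - k) * math.log(1.0 - p)
            if ll > best_ll:
                best_ll, best_params = ll, (float(mu), float(sigma))
    return best_params

# Simulated example: 20 trials per SOA (-125 to 175 ms, 25-ms steps),
# responses generated from an assumed true PSE of 5 ms.
soas = list(range(-125, 176, 25))
counts = [round(20 * norm_cdf(s, 5.0, 30.0)) for s in soas]
mu, sigma = fit_pse(soas, counts, [20] * len(soas))
print(f"estimated PSE = {mu:.0f} ms, sigma = {sigma:.0f} ms")
```

The recovered `mu` sits near the generating value of 5 ms, matching the PSE reported for the 1-m viewing distance.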

[1] R. D. Alexander. The Biology of Moral Systems, 1989.

[2] C. Boehm et al. Unto Others: The Evolution and Psychology of Unselfish Behavior, 1999.

[3] M. Mesterton-Gibbons et al. Genetic and Cultural Evolution of Cooperation, 2004.

[4] R. Rosenfeld. Nature, 2009. Otolaryngology-Head and Neck Surgery.

[5] M. Tomasello et al. Primate Cognition, 2010. Topics in Cognitive Science.