Variations in blur are present in retinal images of scenes containing objects at multiple depth planes. Here we examine whether neural representations of image blur can be recalibrated as a function of depth. Participants were exposed to textured images whose blur changed with depth in a novel manner. For one group of participants, image blur increased as the images moved closer; for the other group, blur increased as the images moved away. A comparison of post-test versus pre-test performance on a blur-matching task at near and far test positions revealed that both groups showed significant experience-dependent recalibration of the relationship between depth and blur. These results demonstrate that blur adaptation is conditioned by 3D viewing contexts.