A Bayesian Model of Sensory Adaptation

Recent studies have reported two opposite types of adaptation in temporal perception. Here, we propose a Bayesian model of sensory adaptation that exhibits both types of adaptation. We regard adaptation as the adaptive updating of estimates of time-evolving variables, which determine the mean of the likelihood function and the mean of the prior distribution in a Bayesian model of temporal perception. Under certain assumptions, we can analytically derive the mean behavior of our model and identify the parameters that determine which type of adaptation occurs. The results suggest that the type of adaptation can be controlled by manipulating the statistical properties of the presented stimuli.
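
The following is a minimal sketch, not the authors' exact formulation, of how such a model could be set up: a Gaussian likelihood and a Gaussian prior whose means are treated as time-evolving estimates and updated after each stimulus. The noise levels, learning rates, and update rules below are illustrative assumptions; the sketch only shows how shifting the likelihood mean and shifting the prior mean push the percept in opposite directions.

import numpy as np

SIGMA_LIKE = 30.0   # assumed sensory noise (ms)
SIGMA_PRIOR = 60.0  # assumed prior width (ms)
ETA_LIKE = 0.05     # assumed learning rate for the likelihood mean
ETA_PRIOR = 0.05    # assumed learning rate for the prior mean

def posterior_mean(x, mu_like, mu_prior):
    """Combine a measurement x with the prior by precision weighting.

    The likelihood-mean estimate mu_like acts as a bias that is subtracted
    from the measurement, so increasing it shifts percepts one way, while
    increasing the prior mean mu_prior pulls percepts the opposite way.
    """
    w = SIGMA_PRIOR**2 / (SIGMA_PRIOR**2 + SIGMA_LIKE**2)
    return w * (x - mu_like) + (1 - w) * mu_prior

def adapt(stimuli, mu_like=0.0, mu_prior=0.0):
    """Update the two mean estimates trial by trial and record percepts."""
    percepts = []
    for x in stimuli:
        percepts.append(posterior_mean(x, mu_like, mu_prior))
        # Exponential updating of both estimated means toward the new sample;
        # the relative learning rates determine which adaptation dominates.
        mu_like += ETA_LIKE * (x - mu_like)
        mu_prior += ETA_PRIOR * (x - mu_prior)
    return np.array(percepts), mu_like, mu_prior

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Repeated exposure to a +100 ms asynchrony, as in lag-adaptation studies.
    stimuli = 100.0 + rng.normal(0.0, SIGMA_LIKE, size=200)
    percepts, mu_like, mu_prior = adapt(stimuli)
    print(f"final likelihood-mean estimate: {mu_like:.1f} ms")
    print(f"final prior-mean estimate:      {mu_prior:.1f} ms")

In this toy setting, which of the two opposite effects appears in the percepts depends on the relative values of ETA_LIKE and ETA_PRIOR, mirroring the idea that the statistics of the adapting stimuli determine the type of adaptation observed.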
