Asynchronous Event-Based Motion Processing: From Visual Events to Probabilistic Sensory Representation

In this work, we propose a two-layered descriptive model of motion processing from the retina to the cortex, driven by event-based input from an asynchronous time-based image sensor (ATIS) camera. Spatial and spatiotemporal filtering of visual scenes by motion-energy detectors is implemented in two stages: a simple-cell layer of a lateral geniculate nucleus model followed by a bank of three-dimensional Gabor kernels, whose outputs form a probabilistic population response. The high temporal resolution of the ATIS's independent, asynchronous pixels provides realistic stimulation for studying biological motion processing, as well as for developing bio-inspired motion processors for computer vision applications. Our study combines two significant theories in neuroscience: event-based stimulation and probabilistic sensory representation. We model how this combination might be realized at the level of vision and suggest the framework as a generic computational principle across sensory modalities.
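The pipeline described above can be sketched in a few lines: events are accumulated into a spatiotemporal volume, correlated with a quadrature pair of three-dimensional Gabor kernels per direction channel, and the resulting motion energies are normalized into a probability distribution over directions. This is a minimal illustrative sketch, not the paper's implementation; all kernel parameters, the event format `(x, y, t, polarity)`, and the single-location readout are assumptions for the example.

```python
import numpy as np

def gabor3d(theta, f_s=0.15, f_t=0.25, size=9, depth=7,
            sigma=2.0, tau=1.5, phase=0.0):
    """Separable 3D (x, y, t) Gabor kernel tuned to motion direction theta.
    All parameter values here are illustrative, not taken from the paper."""
    xs = np.arange(size) - size // 2
    ts = np.arange(depth) - depth // 2
    x, y, t = np.meshgrid(xs, xs, ts, indexing="ij")
    u = x * np.cos(theta) + y * np.sin(theta)  # axis along preferred direction
    env = np.exp(-(x**2 + y**2) / (2 * sigma**2) - t**2 / (2 * tau**2))
    return env * np.cos(2 * np.pi * (f_s * u + f_t * t) + phase)

def motion_energy(volume, thetas):
    """Oriented motion energy from quadrature pairs of 3D Gabors,
    normalized into a probabilistic population response over directions."""
    energies = []
    for th in thetas:
        even = gabor3d(th, phase=0.0)
        odd = gabor3d(th, phase=np.pi / 2)
        h, w, d = even.shape
        patch = volume[:h, :w, :d]  # single-location readout for brevity
        energies.append(np.sum(patch * even) ** 2 + np.sum(patch * odd) ** 2)
    e = np.asarray(energies)
    total = e.sum()
    return e / total if total > 0 else np.full(len(thetas), 1.0 / len(thetas))

# Accumulate hypothetical ATIS events (x, y, t, polarity) into a volume;
# the four events below trace a rightward-moving edge.
events = [(3, 4, 1, 1), (4, 4, 2, 1), (5, 4, 3, 1), (6, 4, 4, 1)]
volume = np.zeros((9, 9, 7))
for x, y, t, p in events:
    volume[x, y, t] += p

thetas = np.linspace(0, 2 * np.pi, 8, endpoint=False)  # 8 direction channels
population = motion_energy(volume, thetas)
print(population)  # 8 non-negative values summing to 1
```

The normalization step is what turns raw filter energies into the probabilistic population response the abstract refers to; in a full model this readout would be computed at every spatial location and updated asynchronously as events arrive.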
