On event-based motion detection and integration

Event-based vision sensors sample individual pixels asynchronously and at a much higher temporal resolution than frame-based cameras, so that each pixel represents the visual input in its receptive field temporally independently of neighboring pixels. The information available at the pixel level for subsequent processing stages is thereby reduced to representations of changes in the local intensity function. In this paper we present theoretical implications of this condition with respect to the structure of light fields for stationary observers and local moving contrasts in the luminance function. On this basis we derive several constraints on the kind of information that can be extracted from event-based sensory acquisition using the address-event representation (AER) principle. We discuss how subsequent visual mechanisms can build upon such representations to integrate motion and static shape information, and we present approaches for motion detection and integration in a neurally inspired model that demonstrates the interaction of early and intermediate stages of visual processing. Results replicating experimental findings demonstrate the abilities of the initial and subsequent stages of the model in the domain of motion processing.
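
For concreteness, the following sketch illustrates the kind of pixel-level reduction described above: an idealized event generator that emits AER-style (x, y, t, polarity) tuples whenever the log intensity at a pixel changes by more than a contrast threshold. This is a minimal sketch, not the paper's sensor model; the function name, threshold value, and frame-based input are illustrative assumptions (a real dynamic vision sensor operates asynchronously in continuous time and exhibits noise and refractory effects).

```python
import numpy as np

def events_from_frames(frames, timestamps, theta=0.15):
    """Idealized AER event generation from a grayscale frame sequence.

    Each pixel keeps the log intensity at which it last fired; an event
    (x, y, t, polarity) is emitted whenever the current log intensity
    deviates from that reference by more than the contrast threshold
    `theta`. Emits at most one event per pixel per frame; a real sensor
    would emit several events for a large change.
    """
    log_ref = np.log(frames[0].astype(np.float64) + 1e-6)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame.astype(np.float64) + 1e-6)
        diff = log_i - log_ref
        for pol, mask in ((+1, diff >= theta), (-1, diff <= -theta)):
            ys, xs = np.nonzero(mask)
            events.extend((int(x), int(y), float(t), pol)
                          for x, y in zip(xs, ys))
            log_ref[mask] = log_i[mask]  # reset reference where events fired
    return events
```

Applied to a short image sequence, this yields a sparse stream in which only pixels whose local luminance changed contribute, which is exactly the reduced representation on which the constraints discussed above operate.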

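As a companion illustration of motion detection from such events, local normal flow can be recovered by fitting a plane to recent event timestamps in a small spatial neighborhood; the gradient of the fitted time surface encodes speed and direction along the contrast normal. This is a standard technique from the event-based optic-flow literature, shown here only as a sketch and not necessarily the model presented in the paper; the neighborhood size, time window, and function name are assumptions.

```python
import numpy as np

def normal_flow_from_events(events, x0, y0, t0, radius=3, dt=0.05):
    """Estimate normal flow at (x0, y0, t0) by fitting the plane
    t = a*x + b*y + c to recent events in a small spatial window.

    The time-surface gradient (a, b) relates to velocity along the
    contrast normal as v = (a, b) / (a^2 + b^2). `events` holds
    (x, y, t, polarity) tuples; all parameters are illustrative.
    """
    pts = [(x, y, t) for (x, y, t, _p) in events
           if abs(x - x0) <= radius and abs(y - y0) <= radius
           and 0.0 <= t0 - t <= dt]
    if len(pts) < 3:
        return None  # too few events to constrain a plane
    A = np.array([[x, y, 1.0] for (x, y, _t) in pts])
    b = np.array([t for (_x, _y, t) in pts])
    (a, b_coef, _c), *_ = np.linalg.lstsq(A, b, rcond=None)
    g2 = a * a + b_coef * b_coef
    if g2 < 1e-12:
        return None  # flat time surface: motion not measurable
    return (a / g2, b_coef / g2)  # normal flow, pixels per time unit
```

Note the aperture problem inherent in this estimate: only the velocity component perpendicular to the local contrast is recovered, which is why subsequent integration stages, such as those the model addresses, are needed.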