Neuromorphic Approach Sensitivity Cell Modeling and FPGA Implementation

Neuromorphic engineering takes inspiration from biology to solve engineering problems using the organizing principles of biological neural computation. The field has demonstrated success in sensor-based applications (vision and audition) as well as in cognition and actuation. This paper focuses on mimicking a functionality of the retina computed by one type of Retinal Ganglion Cell (RGC): the early detection of approaching (expanding) dark objects. The paper presents a software model and a hardware logic FPGA implementation of this approach sensitivity cell, which can be used as an attention mechanism in later cognition layers. The input to the hardware-modeled cell comes from an asynchronous spiking Dynamic Vision Sensor (DVS), yielding an end-to-end event-based processing system. The software model was developed in Java and achieves an average processing time of 370 ns per event on a NUC embedded computer. The output firing rate for an approaching object depends on the cell parameters that set the number of input events needed to reach the firing threshold. For the hardware implementation on a Spartan-6 FPGA, the processing time is reduced to 160 ns per event with the clock running at 50 MHz.
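
To make the event-count firing mechanism concrete, the following Java sketch (a hypothetical illustration, not the authors' actual jAER filter or FPGA logic) treats OFF events from an expanding dark object as excitation and ON events as inhibition, emitting an output spike once a configurable event-count threshold is reached. The class, field, and parameter names are assumptions introduced for this example only.

// Minimal sketch of an approach sensitivity cell driven by DVS address-events.
// Assumptions: OFF events (darkening pixels of an approaching dark object) excite
// the cell, ON events inhibit it, and the cell fires when the accumulated
// excitation reaches a configurable event-count threshold (a cell parameter).
public class ApproachCellSketch {

    /** One DVS address-event: pixel coordinates, polarity and timestamp. */
    static final class DvsEvent {
        final int x, y;
        final boolean on;        // true = ON (brighter), false = OFF (darker)
        final long timestampUs;
        DvsEvent(int x, int y, boolean on, long timestampUs) {
            this.x = x; this.y = y; this.on = on; this.timestampUs = timestampUs;
        }
    }

    private final int firingThreshold;      // input events needed to reach threshold
    private final double inhibitionWeight;  // weight of ON (inhibitory) events
    private double membrane = 0.0;          // accumulated activity ("membrane potential")
    private long spikeCount = 0;

    ApproachCellSketch(int firingThreshold, double inhibitionWeight) {
        this.firingThreshold = firingThreshold;
        this.inhibitionWeight = inhibitionWeight;
    }

    /** Processes one event; returns true when the cell emits an output spike. */
    boolean process(DvsEvent e) {
        if (e.on) {
            membrane -= inhibitionWeight;   // ON events inhibit (brightening/receding)
            if (membrane < 0) membrane = 0;
        } else {
            membrane += 1.0;                // OFF events excite (approaching dark object)
        }
        if (membrane >= firingThreshold) {
            membrane = 0;                   // reset after firing
            spikeCount++;
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        ApproachCellSketch cell = new ApproachCellSketch(100, 0.5);
        // Synthetic burst of OFF events, as an expanding dark object would produce.
        long t = 0;
        int fired = 0;
        for (int i = 0; i < 1000; i++) {
            if (cell.process(new DvsEvent(64, 64, false, t += 10))) fired++;
        }
        System.out.println("Output spikes: " + fired); // higher input rate -> higher firing rate
    }
}

In this simplified form, the output firing rate grows with the rate of excitatory input events, mirroring the abstract's statement that the firing rate of the modeled cell depends on the event-count threshold parameter.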
