Approaching Retinal Ganglion Cell Modeling and FPGA Implementation for Robotics

Neuromorphic engineering aims to solve engineering problems by applying the organizing principles of biological neural computation. The field has demonstrated success in sensing (vision and audition) as well as in cognition and actuation. This paper focuses on mimicking the approach-detection functionality of the retina, which is computed by one type of Retinal Ganglion Cell (RGC), and on its application to robotics. These RGCs fire action potentials when an expanding object is detected. In this work we compare software and FPGA hardware-logic implementations of this approach-sensitivity function, and we measure the hardware latency when it is applied to robots as an attention/reaction mechanism. The visual input for these cells comes from an asynchronous, event-driven Dynamic Vision Sensor (DVS), yielding an end-to-end event-based processing system. The software model, developed in Java, achieves an average processing time of 370 ns per event on a NUC embedded computer. The output firing rate for an approaching object depends on cell parameters that set the number of input events needed to reach the firing threshold. The hardware implementation, on a Spartan-6 FPGA clocked at 50 MHz, reduces the processing time to 160 ns per event. The entropy of the output has been calculated to show that, owing to several bio-inspired characteristics, the system's response to approaching objects is not fully deterministic. A Summit XL mobile robot was measured to react to an approaching object within 90 ms, a latency that can serve as an attentional mechanism. This is faster than comparable event-based approaches in robotics and on a par with human reaction latencies to visual stimuli.
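The accumulate-to-threshold mechanism described above lends itself to a compact sketch. The Java fragment below (the paper's software model is in Java, but this code is not taken from it) illustrates one way a single cell could integrate DVS polarity events toward a firing threshold; the class name, the leak parameter, and the polarity convention (OFF events from an expanding dark edge excite the cell, ON events inhibit it) are illustrative assumptions, not the authors' implementation.

```java
// Minimal sketch of an accumulate-to-threshold approach-sensitivity cell.
// Assumptions (hypothetical, not from the paper's code): OFF events excite
// the cell, ON events inhibit it, and the accumulated state leaks over time.
public class ApproachCellSketch {

    // Cell parameters: the threshold sets how many net input events are
    // needed before the cell fires, as described in the abstract.
    private final double firingThreshold;     // e.g. 100 net events
    private final double leakPerMicrosecond;  // decay rate of the state
    private double membrane = 0.0;            // accumulated net excitation
    private long lastTimestampUs = 0;         // timestamp of previous event

    public ApproachCellSketch(double firingThreshold, double leakPerMicrosecond) {
        this.firingThreshold = firingThreshold;
        this.leakPerMicrosecond = leakPerMicrosecond;
    }

    /**
     * Processes one DVS address-event; returns true if the cell fires.
     *
     * @param isOffEvent  true for an OFF event (excitatory in this sketch),
     *                    false for an ON event (inhibitory in this sketch)
     * @param timestampUs event timestamp in microseconds
     */
    public boolean onEvent(boolean isOffEvent, long timestampUs) {
        // Leak the accumulated state in proportion to elapsed time.
        long dtUs = timestampUs - lastTimestampUs;
        membrane = Math.max(0.0, membrane - leakPerMicrosecond * dtUs);
        lastTimestampUs = timestampUs;

        // Excite on OFF events, inhibit on ON events; clamp at zero.
        membrane += isOffEvent ? 1.0 : -1.0;
        if (membrane < 0.0) membrane = 0.0;

        // Fire and reset once enough net events have accumulated.
        if (membrane >= firingThreshold) {
            membrane = 0.0;
            return true; // output spike: candidate approaching object
        }
        return false;
    }
}
```

Under this scheme, a looming object produces a growing rate of excitatory edge events, so the threshold is crossed sooner as the object gets closer; raising the threshold parameter trades response latency for robustness to noise events, which matches the abstract's statement that the output firing rate depends on the number of input events required to fire.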
