A Neuro-Inspired Computational Model for a Visually Guided Robotic Lamprey Using Frame- and Event-Based Cameras

The computational load associated with computer vision is often prohibitive and limits the capacity for on-board image analysis in compact mobile robots. Replicating the kind of feature detection and neural processing that animals excel at remains a challenge in most biomimetic aquatic robots. Event-driven sensors use a biologically inspired sensing strategy that eliminates the need for complete frame capture. Systems employing event-driven cameras enjoy reduced latency, power consumption, and bandwidth, and benefit from a large dynamic range. However, to the best of our knowledge, no work has been done to evaluate the performance of these devices in underwater robotics. This work proposes a robotic lamprey design capable of supporting computer vision, and uses this system to validate a computational neuron model for driving anguilliform swimming. The robot is equipped with two different types of cameras: frame-based and event-based. These were used to stimulate the neural network, yielding goal-oriented swimming. Finally, a study is conducted comparing the performance of the computational model when driven by the two different types of camera. It was observed that event-based cameras improved the accuracy of swimming trajectories and led to significant improvements in the rate at which visual inputs were processed by the network.
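To illustrate the kind of pipeline the abstract describes (asynchronous visual events biasing a neural controller that produces an undulatory gait), the following is a minimal sketch, not the paper's implementation. It assumes a hypothetical event format of (x, y, polarity, t) tuples and a simple chain of phase oscillators standing in for the computational neuron model; the functions `turning_bias` and `cpg_step` and all constants are illustrative assumptions.

```python
# Illustrative sketch only (not the paper's method): asynchronous event-camera
# output biases a simple central pattern generator (CPG) toward a visual target.
import numpy as np

def turning_bias(events, width=128):
    """Map a batch of (x, y, polarity, t) events to a left/right bias in [-1, 1]."""
    if not events:
        return 0.0
    xs = np.array([e[0] for e in events], dtype=float)
    # Events clustered right of the image centre push the bias positive.
    return float(np.clip(np.mean(xs - width / 2) / (width / 2), -1.0, 1.0))

def cpg_step(phase, dt=0.01, freq=1.0, bias=0.0, n_joints=10):
    """Advance a chain of phase oscillators; the visual bias skews joint amplitudes."""
    phase += 2.0 * np.pi * freq * dt
    lags = np.linspace(0.0, 2.0 * np.pi, n_joints)          # travelling-wave phase lags
    amps = 0.3 * (1.0 + 0.5 * bias * np.linspace(0.0, 1.0, n_joints))
    return phase, amps * np.sin(phase - lags)

if __name__ == "__main__":
    phase = 0.0
    events = [(90, 40, 1, 0.001), (95, 42, 1, 0.002)]       # synthetic right-of-centre events
    bias = turning_bias(events)
    phase, joint_angles = cpg_step(phase, bias=bias)
    print("turning bias:", round(bias, 3))
    print("joint angles:", np.round(joint_angles, 3))
```

Because events arrive asynchronously, a controller of this shape can update the bias as soon as a small batch accumulates, rather than waiting for a full frame; this is the latency advantage the abstract attributes to event-based sensing.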
