Towards Real-World Neurorobotics: Integrated Neuromorphic Visual Attention

Neuromorphic hardware and cognitive robots seem like an obvious fit, yet progress to date has been frustrated by the absence of tangible real-world behaviour. System limitations, chiefly the simple and usually proprietary nature of both neuromorphic and robotic platforms, have often been the fundamental barrier. Here we present an integration of a mature “neuromimetic” chip, SpiNNaker, with the humanoid iCub robot using a direct address-event representation (AER) interface, which avoids the need for complex proprietary protocols by sending information as UDP-encoded spikes over an Ethernet link. Using an existing neural model devised for visual object selection, we enable the robot to perform a real-world task: fixating attention upon a selected stimulus. The results demonstrate that the interface and model can drive the robot towards stimulus-specific object selection. Using SpiNNaker as an embeddable neuromorphic device illustrates the importance of two design features in a prospective neurorobot: universal configurability, which allows the chip to be conformed to the requirements of the robot rather than the other way around, and standard interfaces, which eliminate difficult low-level issues of connectors, cabling, signal voltages, and protocols. While this study is only a building block towards that goal, the iCub-SpiNNaker system demonstrates a path towards meaningful behaviour in robots controlled by neural network chips.
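The AER-over-UDP idea described above can be sketched as follows. This is a minimal illustration, not the actual SpiNNaker or iCub protocol: the packet layout (a flat sequence of little-endian 32-bit neuron addresses), the port number, and the function names are all assumptions introduced here for clarity.

```python
import socket
import struct

def encode_spikes(neuron_ids):
    """Pack a list of neuron IDs as little-endian 32-bit address events."""
    return struct.pack("<%dI" % len(neuron_ids), *neuron_ids)

def decode_spikes(payload):
    """Unpack a UDP payload back into the list of neuron IDs it encodes."""
    count = len(payload) // 4
    return list(struct.unpack("<%dI" % count, payload))

def send_spikes(neuron_ids, host="127.0.0.1", port=17893):
    """Send one packet of address events to a listening neuromorphic board.

    The host and port here are placeholders for the board's Ethernet endpoint.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(encode_spikes(neuron_ids), (host, port))
    finally:
        sock.close()
```

The key property of such an interface is that any device able to open a UDP socket can exchange spikes with the chip, sidestepping custom connectors and signal-level conversion.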
