Multisensory architectures for action-oriented perception

Solving the navigation problem for a mobile robot in an unstructured environment requires a versatile sensory system and efficient locomotion control algorithms. This paper presents an innovative sensory system for action-oriented perception applied to a legged robot. A central problem we address is how to exploit a large variety and number of sensors while keeping the system able to operate in real time. Our solution is to use sensory subsystems that incorporate analog and parallel processing, inspired by biological systems, to reduce the data exchange required with the motor control layer. In particular, for the visual system we use the Eye-RIS v1.1 board made by Anafocus, which is based on a fully parallel, mixed-signal array sensor-processor chip. The hearing sensor is inspired by the cricket auditory system and allows efficient localization of a specific sound source with a very simple analog circuit. The robot also carries sensors for touch, posture, load, distance, and heading, whose concurrent acquisition requires customized parallel processing. We therefore use Field Programmable Gate Array (FPGA) based hardware to manage the multi-sensory acquisition and processing: FPGAs permit the implementation of customized digital logic blocks that operate in parallel, so all sensors can be driven simultaneously. With this approach the proposed multi-sensory architecture achieves real-time operation.
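The cricket's pressure-difference hearing can be illustrated with a minimal digital sketch (the paper's sensor is an analog circuit; this simulation, including the function name, delay value, and signal parameters, is our own illustrative assumption). Each tympanum combines the direct sound at one side with a delayed, inverted copy arriving internally from the other side, so the ear nearer the source responds more strongly and a simple amplitude comparison yields the turn direction:

```python
import numpy as np

def cricket_direction(left, right, internal_delay, fs):
    """Pressure-difference receiver model: each ear subtracts a
    delayed copy of the opposite side's signal; the ear facing the
    source ends up with the larger response energy."""
    d = int(round(internal_delay * fs))
    ear_left = left[d:] - right[:-d]
    ear_right = right[d:] - left[:-d]
    # Steer toward the louder ear.
    return "left" if np.mean(ear_left ** 2) > np.mean(ear_right ** 2) else "right"

# Simulated 4.7 kHz calling-song tone; a source on the left reaches the
# left microphone about 20 microseconds before the right one (values
# chosen for illustration only).
fs = 1_000_000                       # 1 MHz sampling to resolve the short delays
t = np.arange(0, 0.01, 1 / fs)
f, itd = 4700.0, 20e-6
left_mic = np.sin(2 * np.pi * f * t)
right_mic = np.sin(2 * np.pi * f * (t - itd))
heading = cricket_direction(left_mic, right_mic, internal_delay=53e-6, fs=fs)
# heading == "left"
```

The internal delay (here 53 µs, roughly a quarter period at 4.7 kHz) tunes the receiver to the species' calling-song frequency, which is why the localization circuit can stay so simple: frequency selectivity and directionality fall out of the same delay-and-subtract structure.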
