Integrating robotics and neuroscience: brains for robots, bodies for brains

Researchers in robotics and artificial intelligence have often looked to biology as a source of inspiration for solving their problems. From the opposite perspective, neuroscientists have recently turned to robotic systems as a way to quantitatively test and analyze theories that would otherwise remain speculative. Computational models of neurons and neuronal networks are often activated with simplified artificial patterns that bear little resemblance to natural stimuli. Robotic systems offer the advantage of introducing phenotypic and environmental constraints similar to those that animal brains face during development and in everyday life. Consideration of these constraints is particularly important in light of modern brain theories, which emphasize the importance of closing the perception/action loop between the agent and the environment. To provide concrete examples of the use of robotic systems in neuroscience, this paper reviews our work in the areas of sensory perception and motor learning. The interdisciplinary approach followed by this research establishes a direct link between the natural sciences and engineering, and can lead to an understanding of basic biological problems while producing robust, flexible systems that operate in the real world.
