A Platform for Robotics Research Based on the Remote-Brained Robot Approach

Robots of the future will act in the real world. It is generally believed that realizing robots with the common sense to do this will require massively parallel processing. This is a problem for those of us who want to experiment with robots in the real world today: it is hard to build active, limber, situated robots when they have to carry heavy brains along. Our answer is the “remote-brained robot.” A remote-brained robot is designed with the brain and body separated, both conceptually and physically. This lets us tie artificial intelligence (AI) with massive parallelism directly to the real world, enabling the verification of high-level AI techniques that could previously be used only in simulation. Placing the brain remotely from the body also encourages us to investigate many important research topics concerning embodied agents working in the real world. This paper describes such a platform for future robotics and the principal experiments performed on it.
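To make the division of labor concrete, below is a minimal sketch of a remote-brained control loop, assuming a TCP socket stands in for the robot's wireless link. The message format, field names, and toy control law are illustrative assumptions for this sketch, not the interface actually used on the platform.

```python
# Minimal sketch of the remote-brained split, assuming a TCP link between
# the on-body controller and the off-board "brain". All names and message
# fields here are hypothetical illustrations, not the platform's actual API.
import json
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9999  # stand-in for the robot's radio link


def brain_server() -> None:
    """Off-board brain: receives sensor readings, returns motor commands."""
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("rw") as link:
            for line in link:
                sensors = json.loads(line)
                # Placeholder "deliberation": a real brain would run vision
                # and planning here, possibly on parallel hardware.
                command = {
                    "joint_velocities": [-0.1 * q for q in sensors["joint_angles"]]
                }
                link.write(json.dumps(command) + "\n")
                link.flush()


def body_client(steps: int = 3) -> None:
    """On-body controller: streams sensor data, applies returned commands."""
    time.sleep(0.2)  # crude startup synchronization, enough for a sketch
    with socket.create_connection((HOST, PORT)) as sock, sock.makefile("rw") as link:
        joint_angles = [0.5, -0.2, 0.1]  # simulated proprioception
        for _ in range(steps):
            link.write(json.dumps({"joint_angles": joint_angles}) + "\n")
            link.flush()
            command = json.loads(link.readline())
            # Integrate the commanded velocities over a unit timestep.
            joint_angles = [
                q + dq for q, dq in zip(joint_angles, command["joint_velocities"])
            ]
            print("joint angles:", joint_angles)


if __name__ == "__main__":
    threading.Thread(target=brain_server, daemon=True).start()
    body_client()
```

Keeping the body side this thin is the point of the design: the robot carries only sensors, actuators, and a radio interface, while the brain can run on arbitrarily heavy hardware and be upgraded without touching the body at all.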
