Vision-based adaptive and interactive behaviors in mechanical animals using the remote-brained approach

We present a variety of vision-based adaptive and interactive behaviors in mechanical animals. Each mechanical animal is a multi-legged robot designed as a remote-brained robot: it does not carry its own brain within its body. Instead, it leaves the brain in the mother environment and communicates with it over radio links. The brain is raised in the mother environment and inherited over generations. The key idea of the remote-brained approach is to interface intelligent software systems with real robot bodies through wireless technology. In this framework the robot system can have a powerful vision system in the brain environment. We have applied this approach to the formation of vision-based dynamic and intelligent behaviors in mechanical animals such as doglike and apelike robots. In this paper we introduce the remote-brained approach and describe several remote-brained robots and the visual processes that support their adaptive and interactive behaviors.
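
To make the architecture concrete, here is a minimal sketch of the remote-brained split: a "body" process that only captures frames and executes motor commands, and a "brain" process that does all vision off-board. The loopback socket stands in for the radio link, and the toy frame format and one-byte command protocol are illustrative assumptions, not the paper's actual implementation.

```python
"""Toy remote-brained sketch: body streams camera frames to an off-board
brain over a socket (standing in for the radio link); the brain runs the
vision step and replies with motor commands. All names, sizes, and the
one-byte command protocol are hypothetical."""
import socket
import threading

HOST, PORT = "127.0.0.1", 9999   # loopback stand-in for the wireless link
WIDTH, HEIGHT = 32, 24           # toy grayscale camera resolution
FRAME_BYTES = WIDTH * HEIGHT


def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("link dropped")
        buf += chunk
    return buf


def brain(server: socket.socket) -> None:
    """Remote brain: the (here trivial) vision system runs off the body."""
    conn, _ = server.accept()
    with conn:
        for _ in range(3):                       # process three frames, then stop
            frame = recv_exact(conn, FRAME_BYTES)
            # Trivial 'vision': the column of the brightest pixel steers the body.
            brightest = max(range(FRAME_BYTES), key=frame.__getitem__)
            col = brightest % WIDTH
            cmd = b"L" if col < WIDTH // 3 else b"R" if col > 2 * WIDTH // 3 else b"F"
            conn.sendall(cmd)                    # one-byte motor command back to body


def body() -> None:
    """Robot body: camera capture and motor execution only; no vision on board."""
    with socket.create_connection((HOST, PORT)) as conn:
        for target_col in (3, 16, 30):           # fake a target moving across the view
            frame = bytearray(FRAME_BYTES)
            frame[5 * WIDTH + target_col] = 255  # single bright pixel = 'target'
            conn.sendall(bytes(frame))
            cmd = recv_exact(conn, 1)
            print(f"target at column {target_col:2d} -> motor command {cmd.decode()}")


if __name__ == "__main__":
    server = socket.create_server((HOST, PORT))
    t = threading.Thread(target=brain, args=(server,), daemon=True)
    t.start()
    body()
    t.join()
    server.close()
```

Running the script prints an L, F, or R command for each frame as the simulated target crosses the field of view. The point of the split is the same as in the paper's framework: the body stays lightweight while arbitrarily heavy vision processing lives in the brain environment on the other side of the link.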
