Imitation-Based Task Programming on a Low-Cost Humanoid Robot

Humanoid robots are complex service platforms with anthropomorphic features, specifically designed for close interaction with humans. Conventional programming strategies are hardly applicable to humanoids because of the large number of degrees of freedom that must be coordinated concurrently; as a result, exploiting the potential of humanoids in service tasks remains an elusive goal. One of the most promising techniques for dealing with humanoid robots is programming by demonstration, which allows even inexperienced users to interact easily with the robot through the teaching-by-showing, or imitation, paradigm. In particular, the abilities to imitate human gestures and to follow task-relevant paths are essential skills for legged humanoid robots, as they provide the fundamental techniques for physical human-robot interaction. This chapter investigates the potential of imitation for programming humanoid motor skills. As the target platform, we have adapted a Robosapien V2 (RSV2), a low-cost small humanoid available on the toy market. The chapter focuses on teaching basic, humanoid-relevant skills such as body postures and walking paths. We have explored and combined multiple sensing sources to capture human motion for imitation, namely a dataglove, an electromagnetic motion tracker, and a monocular vision system for landmark recognition. The imitation approach illustrated in this chapter is rather general, even though its implementation is constrained by the limitations of RSV2 and by sensor inaccuracies. In particular, the chapter reports successful experiments on gesture imitation, including arm motion as well as upper-body and head movements. The gesture repertoire learned by the robot can serve both as a body language for understanding human requests during human-robot interaction and as a set of primitives that can be combined to program more complex tasks. We believe that a thorough assessment of a low-cost humanoid robot is extremely valuable for the robotics research community, since the technological requirements and the costs of developing more advanced humanoid robots still prevent them from becoming broadly available. Currently, most high-end humanoids are developed as prototype platforms under the supervision of major private companies. Therefore, low-cost humanoid platforms such as RSV2 provide an exciting and affordable opportunity for research on integrating humanoids into service tasks.
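
To make the gesture-imitation step more concrete, the sketch below shows one plausible way captured human joint angles could be retargeted to a low-cost humanoid with very coarse actuation. It is a minimal illustration under stated assumptions, not the chapter's actual implementation: the joint names, the discrete position table `RSV2_JOINT_POSITIONS`, and the angle values are hypothetical, standing in for whatever command set an adapted RSV2 actually exposes.

```python
"""Minimal retargeting sketch for gesture imitation.

Assumptions (not from the chapter): joint angles arrive from the
motion tracker in radians, and the adapted RSV2 can only reach a
small set of discrete positions per arm joint. All names and values
below are illustrative placeholders.
"""

# Hypothetical discrete positions (radians) reachable by each RSV2 joint.
RSV2_JOINT_POSITIONS = {
    "shoulder_pitch": [-1.2, -0.6, 0.0, 0.6, 1.2],
    "shoulder_roll":  [0.0, 0.5, 1.0],
    "elbow_flex":     [0.0, 0.7, 1.4],
}


def retarget(human_angles: dict) -> dict:
    """Map tracked human joint angles to the nearest reachable robot position."""
    commands = {}
    for joint, angle in human_angles.items():
        positions = RSV2_JOINT_POSITIONS.get(joint)
        if positions is None:
            continue  # joint not actuated on the low-cost platform
        # Snap to the closest discrete position the robot can reach.
        commands[joint] = min(positions, key=lambda p: abs(p - angle))
    return commands


if __name__ == "__main__":
    # Example: one tracker sample for the right arm.
    sample = {"shoulder_pitch": 0.45, "shoulder_roll": 0.9, "elbow_flex": 1.1}
    print(retarget(sample))
    # -> {'shoulder_pitch': 0.6, 'shoulder_roll': 1.0, 'elbow_flex': 1.4}
```

In practice, each snapped posture would be issued as a robot command and could also be stored as a primitive, so that sequences of such primitives form the gesture repertoire described above.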
