Behzad Dariush, Michael Gienger, Arjun Arumbakkam, Christian Goerick, Youding Zhu, Kikuo Fujimura

Transferring motion from a human demonstrator to a humanoid robot is an important step toward developing robots that are easily programmable and that can replicate or learn from observed human motion. The so-called motion retargeting problem has been well studied, and several offline solutions exist based on optimization approaches that rely on pre-recorded human motion data collected with a marker-based motion capture system. From the perspective of human-robot interaction, however, there is growing interest in online, markerless motion transfer. Such requirements place stringent demands on retargeting algorithms and limit the applicability of offline, pre-recorded methods. To address these limitations, we present an online, task-space, control-theoretic retargeting formulation that generates robot joint motions adhering to the robot's joint-limit, self-collision, and balance constraints. The inputs to the proposed method are low-dimensional, normalized human motion descriptors, detected and tracked using a vision-based feature detection and tracking algorithm. The vision algorithm relies neither on markers placed on anatomical landmarks nor on special instrumentation or calibration. The current implementation requires a depth image sequence, collected from a single time-of-flight imaging device. We present online experimental results of the entire pipeline on the Honda humanoid robot ASIMO.
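The task-space formulation sketched in the abstract can be illustrated with a minimal closed-loop inverse kinematics (CLIK) step that tracks a task descriptor while steering joints away from their limits via a weighted least-norm solution. This is a hedged sketch only: the 3-link planar arm, the gains, and the weighting function are illustrative assumptions, not the paper's actual whole-body formulation (which additionally handles self-collision and balance constraints).

```python
import numpy as np

# Illustrative 3-link planar arm (link lengths and joint limits are assumptions)
L = np.array([1.0, 0.8, 0.5])
Q_MIN = np.radians([-150.0, -150.0, -150.0])
Q_MAX = np.radians([150.0, 150.0, 150.0])

def fk(q):
    """End-effector (x, y) position of the planar arm."""
    s = np.cumsum(q)
    return np.array([np.sum(L * np.cos(s)), np.sum(L * np.sin(s))])

def jacobian(q):
    """2x3 task Jacobian of the end-effector position."""
    s = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(s[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(s[i:]))
    return J

def limit_weights(q, eps=1e-3):
    """Diagonal weights that grow as a joint nears its limit,
    in the spirit of the weighted least-norm scheme."""
    mid = 0.5 * (Q_MIN + Q_MAX)
    rng = 0.5 * (Q_MAX - Q_MIN)
    return np.diag(1.0 / np.maximum(eps, 1.0 - ((q - mid) / rng) ** 2))

def clik_step(q, x_des, dt=0.02, k_p=5.0, damping=1e-2):
    """One damped, weighted CLIK update toward a task-space target."""
    e = x_des - fk(q)                      # task-space tracking error
    J = jacobian(q)
    W_inv = np.linalg.inv(limit_weights(q))
    # weighted, damped least-squares pseudoinverse solution
    A = J @ W_inv @ J.T + damping * np.eye(2)
    dq = W_inv @ J.T @ np.linalg.solve(A, k_p * e)
    return np.clip(q + dt * dq, Q_MIN, Q_MAX)

q = np.radians([10.0, 20.0, -15.0])
target = np.array([1.2, 1.0])              # reachable task descriptor
for _ in range(300):
    q = clik_step(q, target)
print(np.linalg.norm(target - fk(q)))      # small residual tracking error
```

In an online retargeting setting, `target` would be replaced each cycle by the tracked human motion descriptor, so the controller continuously follows the demonstrator while the weighting keeps joints within their bounds.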
