Task-Oriented Motion Mapping on Robots of Various Configuration Using Body Role Division

Many works in robot teaching focus on teaching either task knowledge, such as geometric constraints, or motion knowledge, such as the motion for accomplishing a task. However, to effectively teach a complex task sequence to a robot, it is important to take advantage of both. Task knowledge provides the goal of each individual task within the sequence and reduces the number of required human demonstrations, whereas motion knowledge contains the task-to-task constraints that would otherwise require expert knowledge to model. In this letter, we propose a body role division approach that combines both types of knowledge from a single human demonstration. The method is inspired by findings on human body motion and uses a body structural analogy to decompose a robot’s body configuration into different roles: body parts that are dominant for imitating the human motion, and body parts that are substitutional for adjusting the imitation with respect to the task knowledge. Our results show that the method scales to robots with different numbers of arm links, guides a robot’s configuration toward one that achieves an upcoming task, and is potentially beneficial for teaching a range of task sequences.
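The abstract does not give the method's formulation, but the core idea of role division can be sketched informally. In the toy example below (a minimal sketch under my own assumptions: a planar serial arm, simple numerical gradient descent, and made-up function names — none of this is the paper's actual algorithm), "dominant" joints are pinned to the demonstrated human angles, while "substitutional" joints are freed to adjust so the end-effector still satisfies a task goal:

```python
import numpy as np

def fk(link_lengths, angles):
    """Planar forward kinematics: end-effector (x, y) of a serial chain."""
    pos = np.zeros(2)
    total = 0.0
    for length, a in zip(link_lengths, angles):
        total += a
        pos += length * np.array([np.cos(total), np.sin(total)])
    return pos

def role_division_map(link_lengths, human_angles, task_target,
                      dominant_idx, substitutional_idx,
                      steps=500, lr=0.05):
    """Illustrative role division (not the paper's method):
    dominant joints copy the demonstrated human angles, while
    substitutional joints descend the task error so the end-effector
    reaches the task target."""
    angles = np.array(human_angles, dtype=float)
    eps = 1e-5
    for _ in range(steps):
        err = fk(link_lengths, angles) - task_target
        f0 = np.dot(err, err)
        # Forward-difference gradient w.r.t. substitutional joints only.
        grad = np.zeros_like(angles)
        for i in substitutional_idx:
            perturbed = angles.copy()
            perturbed[i] += eps
            e2 = fk(link_lengths, perturbed) - task_target
            grad[i] = (np.dot(e2, e2) - f0) / eps
        angles -= lr * grad
        # Dominant joints stay pinned to the demonstrated motion.
        for i in dominant_idx:
            angles[i] = human_angles[i]
    return angles
```

For a 3-link arm where the base joint is substitutional (trunk-like) and the outer two joints are dominant (arm-like), the base joint compensates until the hand reaches the task target while the imitated arm motion is preserved. The real method additionally derives the role split from a body structural analogy between human and robot, which this sketch does not attempt.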
