Encoding Bi-manual Coordination Patterns From Human Demonstrations

Humans perform tasks such as bowl mixing bi-manually, but programming them on a robot can be challenging, especially for tasks that require force control or on-line stiffness modulation. In this paper, we first propose a user-friendly setup for demonstrating bi-manual tasks while collecting complementary information: the motion and forces sensed on a robotic arm, as well as the human hand configuration and grasp information. Second, to learn the task, we propose a method for extracting task constraints for each arm and coordination patterns between the arms. We statistically encode the data based on the extracted constraints and reproduce the task using a Cartesian impedance controller.
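For context, the reproduction stage relies on Cartesian impedance control. A minimal sketch of a standard variable-stiffness Cartesian impedance law is given below; the symbols (desired pose $x_d$, stiffness $K(t)$, damping $D(t)$) are illustrative assumptions and not necessarily the paper's exact formulation:

\[
F = K(t)\,(x_d - x) + D(t)\,(\dot{x}_d - \dot{x}), \qquad \tau = J(q)^{\top} F,
\]

where $x$ is the measured end-effector pose, $J(q)$ the manipulator Jacobian, and $\tau$ the commanded joint torques. On-line stiffness modulation then amounts to varying $K(t)$ along the task, e.g. stiffer in directions where the demonstrations are consistent and more compliant elsewhere.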
