Gesture-Based Extraction of Robot Skill Parameters for Intuitive Robot Programming

Despite extensive research in the field, very little experience exists with Teaching by Demonstration (TbD) in actual industrial use cases. In the factory of the future, flexible mobile manipulators must be rapidly reprogrammed to perform new tasks as the need arises, for which a working TbD system would be ideal. Contrary to current TbD approaches, which generally aim to recognize both the action and where it is applied, we propose a division of labor: the operator manually specifies the action the robot should perform, while gestures specify the relevant action parameter (e.g., the object on which to apply the action). This two-step method has three advantages: there is no uncertainty about which action the robot will perform; it accommodates a changing environment, so objects need not be at predefined locations; and parameter specification is possible even for inexperienced users. Experiments with 24 participants in 3 different environments verify that programming a mobile manipulator with this method is indeed intuitive, even for robotics novices.
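
To make the two-step specification concrete, the following is a minimal sketch of how such a pipeline could be structured: the operator first selects a skill explicitly (step 1), then a pointing gesture is resolved against the robot's currently detected objects to supply the skill's target parameter (step 2). All names here (`resolve_pointing_target`, `DetectedObject`, the sensor inputs) are hypothetical stand-ins for illustration and are not taken from the paper.

```python
# Sketch of the two-step method described above. The gesture is modeled as a
# pointing ray from the operator's hand; the parameter is resolved as the
# detected object closest to that ray. All helper names are assumptions.
from dataclasses import dataclass

import numpy as np


@dataclass
class DetectedObject:
    name: str
    position: np.ndarray  # object centroid in the robot's world frame


def resolve_pointing_target(origin: np.ndarray,
                            direction: np.ndarray,
                            objects: list[DetectedObject]) -> DetectedObject:
    """Return the detected object whose centroid lies closest to the pointing ray."""
    direction = direction / np.linalg.norm(direction)

    def distance_to_ray(obj: DetectedObject) -> float:
        v = obj.position - origin
        t = max(float(np.dot(v, direction)), 0.0)  # clamp: ignore objects behind the hand
        return float(np.linalg.norm(v - t * direction))

    return min(objects, key=distance_to_ray)


# Step 1: the operator explicitly chooses the skill, so there is no
# uncertainty about which action the robot will perform.
skill = "pick"  # e.g., selected from a menu on a teach pendant or tablet

# Step 2: a pointing gesture supplies the action parameter. In practice the
# hand pose would come from a body-tracking sensor and the object list from
# the robot's perception system; here both are placeholder values.
hand_origin = np.array([0.0, 0.0, 1.4])
hand_direction = np.array([0.7, 0.1, -0.7])
objects = [DetectedObject("box_a", np.array([1.2, 0.3, 0.8])),
           DetectedObject("box_b", np.array([1.0, -0.5, 0.8]))]

target = resolve_pointing_target(hand_origin, hand_direction, objects)
print(f"Execute skill '{skill}' on object '{target.name}'")
```

Because objects are resolved at specification time from live perception rather than from stored coordinates, this kind of pipeline tolerates a changing environment, which is one of the advantages claimed for the two-step method.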
