Spatial Programming for Industrial Robots through Task Demonstration

Abstract We present an intuitive system for programming industrial robots that combines markerless gesture recognition and mobile augmented reality within a programming-by-demonstration framework. The approach covers gesture-based task definition and adaptation through human demonstration, as well as task evaluation through augmented reality. A 3D motion-tracking system and a handheld device form the basis of the presented spatial programming system. In this publication, we present a prototype for programming an assembly sequence consisting of several pick-and-place tasks. A scene reconstruction provides pose estimation of known objects using the handheld's 2D camera. The programmer can thus define the program through natural bare-hand manipulation of these objects, aided by direct visual feedback in the augmented reality application. The resulting program can be adapted by gestures and subsequently transmitted to an arbitrary industrial robot controller through a unified interface. Finally, we discuss an application of the presented spatial programming approach to robot-based welding tasks.
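The abstract describes an assembly program as a sequence of pick-and-place tasks, derived from demonstrated object poses and transmitted to a robot controller over a unified interface. A minimal sketch of such a task representation might look as follows; all class names, field names, and the JSON wire format are our own illustration and not the authors' actual interface:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Pose:
    # Hypothetical object pose as estimated from the handheld's 2D camera
    # (position in mm, yaw in degrees); the paper's actual pose format is not specified here.
    x: float
    y: float
    z: float
    yaw: float

@dataclass
class PickPlaceTask:
    # One demonstrated pick-and-place step: grasp the object at 'pick', release it at 'place'.
    object_id: str
    pick: Pose
    place: Pose

def serialize_program(tasks):
    """Serialize the demonstrated task sequence for transmission to a robot controller."""
    return json.dumps([asdict(t) for t in tasks])

# Example: a two-step assembly sequence defined by bare-hand demonstration
program = [
    PickPlaceTask("part_A", Pose(100, 50, 0, 0), Pose(200, 50, 0, 90)),
    PickPlaceTask("part_B", Pose(120, 80, 0, 0), Pose(200, 50, 20, 0)),
]
payload = serialize_program(program)
```

Representing the program as plain serializable data (rather than controller-specific code) matches the abstract's claim that the same task definition can be sent to an arbitrary industrial robot controller.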
