Spatial programming for industrial robots based on gestures and Augmented Reality

The presented spatial programming system provides assistance for the online programming of industrial robots. A handheld device and a motion tracking system form the basis of a modular 3D programming approach covering the different phases of robot programming: definition, evaluation and adaptation. Static and dynamic gestures enable the definition of poses, trajectories and tasks. Spatial evaluation is performed with an Augmented Reality application on the handheld device, so the programmer can move freely within the robot cell and define the program spatially through gestures. The camera image of the handheld is simultaneously augmented with virtual objects representing the robot program. Based on 3D motion tracking of human movements and the mobile Augmented Reality application, we introduce a novel kind of interaction for the adaptation of robot programs: the programmer manipulates virtual program components through bare-hand gestures. Examples of such interactions include translation and rotation, applicable to the representations of poses, trajectories and tasks. Finally, the program is adapted according to the gestural changes and can be transferred from the handheld device directly to the robot controller.
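How gestural translations and rotations might propagate to poses and trajectories can be illustrated with a short sketch. The following Python snippet is a minimal illustration of the program model outlined above, not the authors' implementation; all class names and the rigid-transformation interface are assumptions made for the example.

```python
import numpy as np

# Minimal sketch of the program model described in the abstract: poses are
# grouped into trajectories, and recognized bare-hand gestures are mapped to
# rigid transformations of those components. All names are illustrative
# assumptions, not the system's actual API.

class Pose:
    """A Cartesian robot pose: position vector plus rotation matrix."""
    def __init__(self, position, orientation=None):
        self.position = np.asarray(position, dtype=float)
        self.orientation = np.eye(3) if orientation is None else np.asarray(orientation, dtype=float)

    def translate(self, offset):
        self.position = self.position + np.asarray(offset, dtype=float)

    def rotate(self, r, pivot):
        # Rotate the pose about a pivot point using rotation matrix r.
        self.position = r @ (self.position - pivot) + pivot
        self.orientation = r @ self.orientation


class Trajectory:
    """An ordered sequence of poses; a gestural edit applies to all of them."""
    def __init__(self, poses):
        self.poses = list(poses)

    def translate(self, offset):
        for p in self.poses:
            p.translate(offset)

    def rotate(self, r, pivot=None):
        # Default pivot: the first pose, so the trajectory rotates in place.
        pivot = self.poses[0].position.copy() if pivot is None else np.asarray(pivot, dtype=float)
        for p in self.poses:
            p.rotate(r, pivot)


# Example: a recognized drag gesture shifts a taught trajectory 50 mm in x.
traj = Trajectory([Pose([0.4, 0.0, 0.2]), Pose([0.5, 0.1, 0.2])])
traj.translate([0.05, 0.0, 0.0])
```

In this sketch a task would simply aggregate trajectories with the same translate/rotate interface, so a single gesture can adapt a component at any level of the program hierarchy before it is transferred to the robot controller.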
