Multimodal sensor-based whole-body control for human-robot collaboration in industrial settings

Abstract: This paper describes the development of a dual-arm robotic system for industrial human–robot collaboration. The demonstrator monitors the shared human–robot workspace through multiple sensor modalities and performs real-time, collision-free dual-arm manipulation. A whole-body control framework serves as the central control element: it generates a coherent output signal for the robot's joints from the multiple controller inputs, the tasks' priorities, the physical constraints, and the current situation. Combinations of controllers and constraints within the whole-body controller form the basic building blocks from which the actions of a sequentially executed high-level plan are composed. In addition, the robotic system can be commanded intuitively via human gestures. These individual capabilities are combined into an industrial demonstrator, which is validated at a gearbox assembly station in a Volkswagen factory.
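The core idea of whole-body control mentioned above — blending several prioritized task controllers into a single joint-space command — can be illustrated with a minimal null-space projection sketch. This is a generic textbook scheme, not the authors' specific framework; all function names are illustrative, and a damped pseudoinverse is assumed for robustness near singularities:

```python
import numpy as np

def damped_pinv(J, lam=1e-2):
    # Damped least-squares pseudoinverse: robust near kinematic singularities.
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    return Vt.T @ np.diag(s / (s**2 + lam**2)) @ U.T

def prioritized_velocities(tasks):
    """Blend task-space velocity controllers into one joint-velocity command.

    tasks: list of (J, xdot) pairs, ordered from highest to lowest priority,
    where J is the task Jacobian and xdot the desired task-space velocity.
    Each lower-priority task is resolved in the null space of all
    higher-priority tasks, so it cannot disturb them.
    """
    n = tasks[0][0].shape[1]       # number of joints
    qdot = np.zeros(n)             # accumulated joint-velocity command
    N = np.eye(n)                  # null-space projector of tasks seen so far
    for J, xdot in tasks:
        JN = J @ N
        # Correct only the task error left over from higher-priority tasks.
        qdot = qdot + damped_pinv(JN) @ (xdot - J @ qdot)
        # Shrink the null space by the row space of this task.
        N = N - damped_pinv(JN) @ JN
    return qdot
```

For compatible tasks on a redundant robot the blended command satisfies all of them; for conflicting tasks, the higher-priority controller wins and the lower-priority one is only executed as far as the remaining null space allows.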
