Formulating Intuitive Stack-of-Tasks with Visuo-Tactile Perception for Collaborative Human-Robot Fine Manipulation

Enabling robots to work in close proximity to humans requires a control framework that not only incorporates multi-sensory information for autonomous, coordinated interaction but also performs perceptive task planning to ensure adaptable and flexible collaborative behaviour. In this research, an intuitive stack-of-tasks (iSoT) formulation is proposed that defines the robot's actions by considering the human-arm postures and the task progression. The framework is augmented with visuo-tactile information to effectively perceive the collaborative environment and to intuitively switch between the planned sub-tasks. Visual feedback from depth cameras monitors and estimates the objects' poses and human-arm postures, while tactile data provides the exploration skills needed to detect and maintain the desired contacts and avoid object slippage. To evaluate the performance, effectiveness, and usability of the proposed framework, assembly and disassembly tasks performed by human-human and human-robot partners are analyzed using distinct evaluation metrics, i.e., approach adaptation, grasp correction, task coordination latency, cumulative posture deviation, and task repeatability.
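The core idea above, an ordered stack of sub-tasks whose transitions are triggered by visuo-tactile cues, can be illustrated with a minimal sketch. This is not the paper's implementation; the `SubTask` structure, the sensor-snapshot dictionary, and the slip-handling rule are all hypothetical stand-ins for the framework's perception and control modules:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SubTask:
    """One entry in the task stack: a name, a completion test, and an action."""
    name: str
    is_done: Callable[[Dict], bool]   # predicate over the latest sensor snapshot
    act: Callable[[Dict], None]       # command issued while this sub-task is active

def run_stack(stack: List[SubTask], read_sensors: Callable[[], Dict]) -> List[str]:
    """Execute sub-tasks in order, advancing when visuo-tactile cues signal completion."""
    completed = []
    for task in stack:
        while True:
            obs = read_sensors()            # e.g. object pose, arm posture, contact state
            if obs.get("slip_detected"):    # tactile override: tighten grasp before proceeding
                obs["grip_force"] += 1.0
            if task.is_done(obs):           # intuitive switch to the next planned sub-task
                completed.append(task.name)
                break
            task.act(obs)
    return completed
```

In this reading, the vision pipeline populates the snapshot with pose and posture estimates, while the tactile pipeline contributes the slip flag that can pre-empt the nominal task order.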
