Real-time and Robust Collaborative Robot Motion Control with Microsoft Kinect® v2

Recent developments in depth sensing provide new opportunities for Human-Robot Interaction (HRI) methods. Collaborative robots (cobots) are redefining HRI across the manufacturing industry, yet little work has addressed Kinect-based HRI in this setting. In this paper, we present an HRI study using a nearest-point approach based on the depth (RGB-D) image of the Microsoft Kinect v2 sensor. The approach relies on the Euclidean distance, which is robust across different environments. The study aims to improve the motion performance of the Universal Robots UR5 manipulator and the efficiency of interaction during collaboration, using the Robot Operating System (ROS) framework and its tools. After the depth data from the Kinect sensor have been processed, the nearest-point distances are transmitted to the robot via ROS.
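The core nearest-point computation can be illustrated as a brute-force minimum Euclidean distance between two 3-D point sets, e.g. points on the robot and points observed in the Kinect depth image. This is only a minimal sketch under assumed inputs (the function and array names are hypothetical and do not reflect the paper's actual implementation, which would obtain its point clouds from the sensor driver and publish the result over a ROS topic):

```python
import numpy as np

def nearest_point_distance(robot_pts, scene_pts):
    """Return the minimum Euclidean distance between two 3-D point sets
    and the corresponding point pair (brute force, O(n*m) pairs)."""
    # Pairwise difference vectors via broadcasting: shape (n, m, 3)
    diff = robot_pts[:, None, :] - scene_pts[None, :, :]
    # Squared Euclidean distances for every pair: shape (n, m)
    d2 = np.einsum('nmk,nmk->nm', diff, diff)
    # Index of the closest pair
    i, j = np.unravel_index(np.argmin(d2), d2.shape)
    return float(np.sqrt(d2[i, j])), robot_pts[i], scene_pts[j]

# Toy example: two small point sets standing in for robot and depth-image points
robot = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
scene = np.array([[0.0, 0.0, 0.5], [2.0, 2.0, 2.0]])
d, p_r, p_s = nearest_point_distance(robot, scene)
```

In practice a real system would typically use a k-d tree (e.g. PCL's or SciPy's) rather than the quadratic brute-force search, but the distance criterion is the same.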
