Integration of Camera and Inertial Measurement Unit for Entire Human Robot Interaction Using Machine Learning Algorithm

The importance of robots in industrial applications is well known. Many industrial tasks cannot be accomplished by a robot alone or by a worker alone; a reliable and safe interaction environment between robots and workers is therefore needed. To achieve this, certain information about the worker must be conveyed to the robot. This article focuses on industrial Human-Robot Interaction. For safe and efficient Human-Robot Interaction, a robot needs to know the worker's position, posture, and gesture. This paper proposes the integration of an Inertial Measurement Unit (IMU) and a 3D camera as an image sensor, so that each sensor compensates for the other's drawbacks, and applies a machine learning algorithm to detect the worker's postures and gestures. Experimental results from interaction with a heavy-duty robot are presented.
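The pipeline the abstract describes, fusing IMU and 3D-camera measurements into one feature vector and classifying postures/gestures with a machine learning algorithm, can be sketched as below. This is a minimal illustration only: the feature layouts, gesture labels, sample values, and the choice of a k-nearest-neighbour classifier are assumptions for demonstration, not the paper's actual data format or method.

```python
# Illustrative sketch: fuse IMU and 3D-camera features, then classify a
# worker gesture with a minimal k-nearest-neighbour rule. All feature
# layouts, labels, and numbers here are hypothetical.
import math
from collections import Counter

def fuse(imu_feats, cam_feats):
    """Concatenate IMU features (e.g. accel/gyro statistics) and camera
    features (e.g. joint positions) into one fused vector."""
    return list(imu_feats) + list(cam_feats)

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label) pairs.
    Returns the majority label among the k training samples nearest to
    `query` under Euclidean distance."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical training set covering two gestures.
train = [
    (fuse([0.1, 0.0], [0.2, 0.1]), "wave"),
    (fuse([0.2, 0.1], [0.1, 0.2]), "wave"),
    (fuse([0.9, 1.0], [1.1, 0.9]), "stop"),
    (fuse([1.0, 0.9], [0.9, 1.1]), "stop"),
]

print(knn_classify(train, fuse([0.15, 0.05], [0.15, 0.15])))  # → wave
```

In practice a stronger classifier (e.g. an SVM, as in the LIBSVM library cited by the paper) would replace the k-NN rule, but the fusion-then-classify structure stays the same.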
