Dual-Hand Detection for Human–Robot Interaction by a Parallel Network Based on Hand Detection and Body Pose Estimation

In this paper, a parallel network based on hand detection and body pose estimation is proposed to detect and distinguish a human's right and left hands. The network is applied to hand-gesture-based human–robot interaction (HRI). The method fully exploits both hand appearance features and the structural information that the human body provides about hand locations. One channel of the network uses a ResNet-Inception Single Shot MultiBox Detector to extract hand features for hand detection. The other channel first estimates the human body pose and then infers the positions of the left and right hands from the forward kinematic tree of the human skeleton. The results of the two channels are then fused; in the fusion module, the human body structure is used to correct the hand detection results and to distinguish the right hand from the left. Experimental results verify that the parallel deep neural network effectively improves hand detection accuracy and reliably distinguishes the right and left hands. The method is also applied to gesture-based interaction between astronauts and an astronaut assistant robot, and proves well suited to this HRI system.
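The fusion step described above — using body-structure cues to label detected hands as left or right — could, in a much simplified form, be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it assumes hand detections arrive as bounding boxes and that the pose channel supplies left- and right-wrist keypoints, and it assigns each box the label of its nearest wrist.

```python
import math


def assign_hand_labels(hand_boxes, left_wrist, right_wrist):
    """Label each detected hand box 'left' or 'right' by its nearest wrist keypoint.

    hand_boxes  -- list of (x_min, y_min, x_max, y_max) tuples from the detector
    left_wrist  -- (x, y) left-wrist keypoint from the pose-estimation channel
    right_wrist -- (x, y) right-wrist keypoint from the pose-estimation channel
    Returns a list of (box, label) pairs.
    """
    labeled = []
    for box in hand_boxes:
        # Use the box centre as the hand's position.
        cx = (box[0] + box[2]) / 2.0
        cy = (box[1] + box[3]) / 2.0
        # Distance from the box centre to each wrist keypoint.
        d_left = math.hypot(cx - left_wrist[0], cy - left_wrist[1])
        d_right = math.hypot(cx - right_wrist[0], cy - right_wrist[1])
        labeled.append((box, "left" if d_left <= d_right else "right"))
    return labeled
```

A real fusion module would additionally gate detections by distance to the skeleton (to discard false positives) and handle missing keypoints; this sketch only shows the nearest-wrist labeling idea.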
