Gesture-based human-robot interface for dual-robot with hybrid sensors

Purpose: This paper proposes a novel gesture-based dual-robot collaborative interaction interface that achieves reliable gesture recognition even when the two hands overlap.

Design/methodology/approach: A hybrid-sensor gesture recognition platform detects both-hand data for dual-robot control. It combines a Leap Motion and a PrimeSense arranged along the vertical direction to capture both-hand data in real time. When the hands occlude each other, each hand is tracked by one of the sensors, and a quaternion-based algorithm converts the data between the two sensors' coordinate systems. When there is no occlusion, the data from the two sensors are fused by a self-adaptive weight fusion algorithm. A collision detection algorithm then checks for collisions between the robots to ensure safety, and the resulting commands are transmitted to the dual robots.

Findings: The interface is implemented on a dual-robot system consisting of two 6-DOF robots. A dual-robot cooperative experiment indicates that the proposed interface is feasible and effective: it takes less time to operate and achieves higher interaction efficiency.

Originality/value: A novel gesture-based dual-robot collaborative interface is proposed. It overcomes the problem of gesture occlusion in two-hand interaction with low computational complexity and low equipment cost, and it maintains long-term stable tracking of two-hand gestures even when the hands occlude each other. It also reduces the number of hand resets, which shortens operation time. The proposed interface achieves natural and safe interaction between the human and the dual robots.
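The paper's quaternion-based coordinate conversion is not detailed in the abstract, but the standard form of such a conversion is a rigid transform: a point measured in one sensor's frame is rotated by a unit quaternion and translated into the other sensor's frame. A minimal sketch of that operation (the specific quaternion and offset between the Leap Motion and PrimeSense frames are calibration values not given here, so the names below are illustrative):

```python
import numpy as np

def quat_mul(a, b):
    # Hamilton product of two quaternions in (w, x, y, z) order.
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    # Conjugate; equals the inverse for a unit quaternion.
    return np.array([q[0], -q[1], -q[2], -q[3]])

def transform_point(p, q, t):
    # Map point p from one sensor frame to the other:
    # p' = q * (0, p) * q^-1 + t, with q a unit quaternion and t a translation.
    pq = np.concatenate(([0.0], p))
    rotated = quat_mul(quat_mul(q, pq), quat_conj(q))[1:]
    return rotated + t
```

For example, a quaternion encoding a 90-degree rotation about the z-axis maps a point on the x-axis onto the y-axis before the translation is applied.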
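The abstract does not spell out the self-adaptive weight fusion algorithm; a common formulation for this class of multi-sensor fusion weights each sensor's measurement inversely to its estimated noise variance, so the more reliable sensor dominates. A sketch under that assumption (the variance estimates would come from the sensors' tracking confidence in practice):

```python
import numpy as np

def adaptive_weight_fusion(measurements, variances):
    # Inverse-variance weighting: w_i = (1/var_i) / sum_j (1/var_j).
    # measurements: (n_sensors, dim) array; variances: length-n_sensors.
    inv_var = 1.0 / np.asarray(variances, dtype=float)
    weights = inv_var / inv_var.sum()
    fused = weights @ np.asarray(measurements, dtype=float)
    return fused, weights
```

With equal variances this reduces to a plain average; as one sensor's variance grows, its weight smoothly shrinks toward zero, which is the behavior the fused hand estimate needs when one view starts to degrade.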
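The collision detection step between the two robots is likewise not detailed here. Work in this area (e.g. the sphere-swept-line bounding volumes cited by the paper) commonly models each robot link as a capsule and declares a collision when the distance between two link segments falls below the sum of their radii. An illustrative sketch, assuming non-degenerate (nonzero-length) link segments:

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    # Minimum distance between segments [p1, q1] and [p2, q2].
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e = d1 @ d1, d2 @ d2          # squared segment lengths (assumed > 0)
    b, c, f = d1 @ d2, d1 @ r, d2 @ r
    denom = a * e - b * b
    # Parameter s on segment 1 (clamped); denom == 0 means parallel segments.
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > 1e-12 else 0.0
    t = (b * s + f) / e              # parameter on segment 2, then clamp
    if t < 0.0:
        t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
    elif t > 1.0:
        t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    return np.linalg.norm((p1 + s * d1) - (p2 + t * d2))

def capsules_collide(p1, q1, r1, p2, q2, r2):
    # Two capsules intersect iff their axis segments come closer
    # than the sum of their radii.
    return segment_distance(p1, q1, p2, q2) <= r1 + r2
```

Checking every link pair of the two 6-DOF robots this way each control cycle keeps the safety test cheap enough for real-time use.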
