An Anthropomorphic Hand with Five Fingers Controlled by a Motion Leap Device

Abstract This paper presents a new solution for the command and control of an anthropomorphic gripper with five fingers, intended for use with industrial robots in assembly tasks of low and medium complexity. The command solution is based on the Motion Leap device and several software modules: HandCommander, HandProcessor and HandSIM. The object to be gripped is recognized by the SpatialVision application using image analysis, and its 3D model is loaded into the GraspIt! application. The user's gesture is recognized and sent to the grasp-testing module and to the RoboHand component in order to grip the preconfigured objects. The object is then gripped in the physical environment by the RoboHand component, the anthropomorphic gripper with five fingers. The gripping of a tennis ball is shown as an example.
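The pipeline described above (gesture acquisition, object recognition, grasp simulation, physical execution) can be summarized in a minimal sketch. The module names HandCommander, HandProcessor, HandSIM, SpatialVision, GraspIt! and RoboHand come from the paper, but their interfaces are not specified there; every function, data structure and value below is a hypothetical placeholder illustrating the data flow only, not the authors' implementation.

```python
# Hypothetical sketch of the command pipeline described in the abstract.
# All function bodies are placeholders; only the module names come from the paper.

from dataclasses import dataclass
from typing import List


@dataclass
class Gesture:
    """Finger flexion values captured from the motion sensor (assumed representation)."""
    finger_angles: List[float]  # one flexion value per finger, in degrees


@dataclass
class GraspPlan:
    """Joint targets for the five-finger gripper (assumed representation)."""
    joint_targets: List[float]


def read_gesture() -> Gesture:
    """Placeholder for HandCommander: acquire the current hand pose from the sensor."""
    # In the real system this would poll the motion sensor's SDK for the current frame.
    return Gesture(finger_angles=[30.0, 45.0, 45.0, 40.0, 35.0])


def recognize_object(image_path: str) -> str:
    """Placeholder for SpatialVision: recognize the target object from an image."""
    # The paper's worked example is a tennis ball.
    return "tennis_ball"


def plan_and_test_grasp(object_id: str, gesture: Gesture) -> GraspPlan:
    """Placeholder for HandProcessor/HandSIM: map the gesture onto the object's
    3D model (loaded in the grasp simulator) and verify the grasp before execution."""
    return GraspPlan(joint_targets=gesture.finger_angles)


def execute_grasp(plan: GraspPlan) -> None:
    """Placeholder for the RoboHand driver: send joint targets to the physical gripper."""
    print(f"Closing fingers to {plan.joint_targets} deg")


if __name__ == "__main__":
    target = recognize_object("scene.png")       # SpatialVision step
    gesture = read_gesture()                      # HandCommander step
    plan = plan_and_test_grasp(target, gesture)   # HandSIM / grasp simulation step
    execute_grasp(plan)                           # RoboHand step
```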
