Bayesian Grasp: Robotic visual stable grasp based on prior tactile knowledge

Robotic grasp detection is a fundamental capability for intelligent manipulation in unstructured environments. Previous work mainly employed visual-tactile fusion to achieve stable grasps, but the whole process depends heavily on regrasping, which consumes considerable time in regulation and evaluation. We propose a novel way to improve robotic grasping: by using learned tactile knowledge, a robot can achieve a stable grasp from a single image. First, we construct a prior tactile knowledge learning framework with a novel grasp quality metric, determined by measuring a grasp's resistance to external perturbations. Second, we propose a multi-phase Bayesian Grasp architecture that generates stable grasp configurations from a single RGB image based on the prior tactile knowledge. Results show that this framework classifies grasp outcomes with an average accuracy of 86% on known objects and 79% on novel objects. The prior tactile knowledge improves the success rate by 55% over traditional vision-based strategies.
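The abstract does not specify the exact model, but the core fusion idea it describes, i.e. weighting visually proposed grasp candidates by a stability prior learned from tactile experience, can be illustrated with a toy Bayesian update (all numbers and function names below are hypothetical, for illustration only):

```python
import numpy as np

def posterior_stability(visual_likelihood, tactile_prior):
    """Toy Bayes rule per candidate grasp:
    P(stable | image) ∝ P(image | stable) * P(stable),
    normalized over the candidate set."""
    unnorm = visual_likelihood * tactile_prior
    return unnorm / unnorm.sum()

# Hypothetical scores for three candidate grasp configurations:
visual_likelihood = np.array([0.6, 0.8, 0.4])  # e.g. from an RGB grasp detector
tactile_prior = np.array([0.3, 0.9, 0.5])      # learned from prior tactile data

post = posterior_stability(visual_likelihood, tactile_prior)
best = int(np.argmax(post))  # grasp configuration selected for execution
```

In this sketch the tactile prior can veto a grasp that looks good visually but was historically unstable under perturbation, which is the mechanism by which prior tactile knowledge could raise the success rate without online regrasping.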
