More Than a Feeling: Learning to Grasp and Regrasp Using Vision and Touch

For humans, the process of grasping an object relies heavily on rich tactile feedback. Most recent robotic grasping work, however, has been based only on visual input, and thus cannot easily benefit from feedback after initiating contact. In this letter, we investigate how a robot can learn to use tactile information to iteratively and efficiently adjust its grasp. To this end, we propose an end-to-end action-conditional model that learns regrasping policies from raw visuo-tactile data. This model—a deep, multimodal convolutional network—predicts the outcome of a candidate grasp adjustment, and then executes a grasp by iteratively selecting the most promising actions. Our approach requires neither calibration of the tactile sensors nor any analytical modeling of contact forces, thus reducing the engineering effort required to obtain efficient grasping policies. We train our model with data from about 6450 grasping trials on a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger. Across extensive experiments, our approach outperforms a variety of baselines at 1) estimating grasp adjustment outcomes, 2) selecting efficient grasp adjustments for quick grasping, and 3) reducing the amount of force applied at the fingers, while maintaining competitive performance. Finally, we study the choices made by our model and show that it has successfully acquired useful and interpretable grasping behaviors.
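The grasp-execution strategy described above (score candidate adjustments with a learned outcome predictor, then greedily pick the most promising one) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `model` callable stands in for the deep visuo-tactile network, and the candidate set, inputs, and `success_threshold` are hypothetical placeholders.

```python
import numpy as np

def select_regrasp(model, rgb, tactile, candidates, success_threshold=0.9):
    """Greedy action selection over a learned outcome predictor.

    `model(rgb, tactile, action)` is assumed to return the predicted
    probability that executing `action` (a grasp adjustment) leads to a
    successful grasp, given the current visual and tactile observations.
    Returns the highest-scoring candidate and a flag indicating whether
    its predicted success already clears the threshold (i.e., the robot
    could stop adjusting and lift).
    """
    scores = np.array([model(rgb, tactile, a) for a in candidates])
    best = int(np.argmax(scores))
    return candidates[best], float(scores[best]) >= success_threshold
```

In a full loop, the robot would execute the returned adjustment, re-observe, and repeat until the success flag is raised, which mirrors the iterative regrasping procedure the abstract outlines.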
