PoseIt: A Visual-Tactile Dataset of Holding Poses for Grasp Stability Analysis

When humans grasp an object in the real world, they often move their arm to hold the object in a different pose where they can use it. In contrast, typical lab settings study grasp stability only immediately after lifting, without any subsequent re-positioning of the arm. However, grasp stability can vary widely with the object's holding pose, since the gravitational torque and the gripper contact forces can change completely. To facilitate the study of how holding poses affect grasp stability, we present PoseIt, a novel multi-modal dataset of visual and tactile data collected over a full cycle of grasping an object, re-positioning the arm to one of the sampled poses, and shaking the object. Using data from PoseIt, we formulate and tackle the task of predicting whether a grasped object is stable in a particular held pose. We train an LSTM classifier that achieves 85% accuracy on this task. Our experiments show that multi-modal models trained on PoseIt achieve higher accuracy than models using vision or tactile data alone, and that our classifiers generalize to unseen objects and poses. The PoseIt dataset is publicly released here: https://github.com/CMURoboTouch/PoseIt.
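As a concrete illustration of the classification setup described above, the following is a minimal PyTorch sketch of a multi-modal LSTM stability predictor. The fusion-by-concatenation design, the feature dimensions, and the layer sizes are illustrative assumptions on our part, not the released PoseIt model; the sketch assumes per-time-step visual and tactile feature vectors (e.g., CNN embeddings of camera frames and tactile images) have already been extracted.

    import torch
    import torch.nn as nn

    class MultiModalLSTMClassifier(nn.Module):
        """Sketch of a visual-tactile LSTM grasp-stability classifier.

        At each time step, a visual feature vector and a tactile feature
        vector are concatenated and fed to an LSTM; the final hidden
        state is mapped to a single stable/unstable logit.
        """

        def __init__(self, vis_dim=512, tac_dim=512, hidden_dim=256):
            super().__init__()
            self.lstm = nn.LSTM(vis_dim + tac_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, 1)  # logit for "grasp is stable"

        def forward(self, vis_feats, tac_feats):
            # vis_feats, tac_feats: (batch, time, feature_dim)
            x = torch.cat([vis_feats, tac_feats], dim=-1)
            _, (h_n, _) = self.lstm(x)            # h_n: (num_layers, batch, hidden_dim)
            return self.head(h_n[-1]).squeeze(-1)  # (batch,)

    # Example usage with random stand-in features for an 8-step sequence.
    model = MultiModalLSTMClassifier()
    vis = torch.randn(4, 8, 512)   # hypothetical visual embeddings per time step
    tac = torch.randn(4, 8, 512)   # hypothetical tactile embeddings per time step
    probs = torch.sigmoid(model(vis, tac))  # predicted probability of stability

In practice such a model would be trained with a binary cross-entropy loss on stable/unstable labels; the sequence here would cover the grasp, re-positioning, and shaking phases of a PoseIt trial.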
