Unplanned, model-free, single-grasp object classification with underactuated hands and force sensors

In this paper we present a methodology for discriminating between different objects using only a single force-closure grasp with an underactuated robot hand equipped with force sensors. The technique combines the benefits of simple, adaptive robot grippers (which can grasp successfully without prior knowledge of the hand or the object model) with an advanced machine learning technique (Random Forests). Unlike prior work in the literature, the proposed methodology requires no object exploration, release, or re-grasping, and it works for arbitrary object positions and orientations within the reach of a grasp. A two-fingered, compliant, underactuated robot hand is controlled in an open-loop fashion to grasp objects of various shapes, sizes, and stiffnesses. The Random Forests classification technique is used to discriminate between the different object classes. The feature space consists only of the actuator positions and the force sensor measurements at two specific time instances of the grasping process. A feature-importance calculation procedure identifies the most crucial features and thus the minimum number of sensors required. The efficiency of the proposed method is validated in two experimental paradigms involving two sets of fabricated model objects of different shapes, sizes, and stiffnesses, as well as a set of everyday objects.
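A minimal sketch of the classification pipeline described above: a Random Forest is trained on per-grasp feature vectors (actuator positions plus force sensor readings at two time instances) and its feature importances are used to rank sensors. The feature layout, dimensions, and synthetic data below are illustrative assumptions, not the authors' dataset or implementation.

```python
# Sketch: single-grasp object classification with a Random Forest and
# feature-importance ranking. All data here is synthetic placeholder data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature vector per grasp (assumed layout):
#   [motor positions at t1 and t2, force sensor readings at t1 and t2]
n_grasps, n_features, n_classes = 300, 18, 4
X = rng.normal(size=(n_grasps, n_features))      # placeholder sensor features
y = rng.integers(0, n_classes, size=n_grasps)    # placeholder object class labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("classification accuracy:", clf.score(X_test, y_test))

# Rank features by importance; in the paper's setting this would indicate
# which actuator/force measurements matter most, and hence the minimum
# sensor set needed for reliable discrimination.
ranking = np.argsort(clf.feature_importances_)[::-1]
for idx in ranking[:5]:
    print(f"feature {idx}: importance {clf.feature_importances_[idx]:.3f}")
```

In practice the synthetic arrays would be replaced by logged grasp data, with one row per grasp and the class label given by the grasped object's identity.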
