Learning the signatures of the human grasp using a scalable tactile glove

Humans can feel, weigh and grasp diverse objects, and simultaneously infer their material properties while applying the right amount of force, a challenging set of tasks for a modern robot1. Mechanoreceptor networks that provide sensory feedback and enable the dexterity of the human grasp2 remain difficult to replicate in robots. Whereas computer-vision-based robot grasping strategies3–5 have progressed substantially with the abundance of visual data and emerging machine-learning tools, there are as yet no equivalent sensing platforms and large-scale datasets with which to probe the use of the tactile information that humans rely on when grasping objects. Studying the mechanics of how humans grasp objects will complement vision-based robotic object handling. Importantly, the inability to record and analyse tactile signals currently limits our understanding of the role of tactile information in the human grasp itself; for example, how tactile maps are used to identify objects and infer their properties is unknown6. Here we use a scalable tactile glove and deep convolutional neural networks to show that sensors uniformly distributed over the hand can be used to identify individual objects, estimate their weight and explore the typical tactile patterns that emerge while grasping objects. The sensor array (548 sensors) is assembled on a knitted glove and consists of a piezoresistive film connected by a network of conductive thread electrodes that are passively probed. Using this low-cost (about US$10) scalable tactile glove, we record a large-scale tactile dataset of 135,000 frames, each covering the full hand, while interacting with 26 different objects. This set of interactions reveals the key correspondences between different regions of a human hand while it is manipulating objects. Insights from the tactile signatures of the human grasp, seen through the lens of an artificial analogue of the natural mechanoreceptor network, can thus aid the future design of prosthetics7, robot grasping tools and human–robot interactions1,8–10.

Tactile patterns obtained from a scalable sensor-embedded glove and deep convolutional neural networks help to explain how the human hand can identify and grasp individual objects and estimate their weights.
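To make the frame-classification idea concrete, the sketch below shows how a single glove frame could be fed to a small convolutional network that outputs one of the 26 object classes. This is a minimal illustration rather than the architecture used in the paper: the rasterization of the 548 sensor readings into a 32 × 32 single-channel "tactile image" and the two-layer network are assumptions made only for this example.

```python
import torch
import torch.nn as nn

class TactileCNN(nn.Module):
    """Minimal per-frame classifier for glove pressure maps.

    Assumption (illustration only): the 548 sensor readings have been
    rasterized into a 32x32 single-channel tactile image.
    """

    def __init__(self, num_objects: int = 26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),  # 32x32 -> 32x32
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 8x8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_objects)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 32, 32) normalized pressure frames
        return self.classifier(self.features(x).flatten(1))

model = TactileCNN()
frames = torch.rand(8, 1, 32, 32)   # stand-in for eight recorded glove frames
logits = model(frames)              # shape: (8, 26)
predicted = logits.argmax(dim=1)    # one predicted object label per frame
```

Since the abstract notes that characteristic tactile patterns emerge over the course of a grasp rather than in any single frame, pooling per-frame predictions across several frames of an interaction would be a natural extension of this per-frame sketch.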

[1] Berthold Bäuml, et al. Robust material classification with a tactile skin using deep learning, 2016, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

[2] Jonathan Tompson, et al. Efficient object localization using Convolutional Networks, 2015, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).

[3] Sachin Chitta, et al. Human-Inspired Robotic Grasp Control With Tactile Sensing, 2011, IEEE Transactions on Robotics.

[4] Yaser Sheikh, et al. Hand Keypoint Detection in Single Images Using Multiview Bootstrapping, 2017, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).

[5] Bolei Zhou, et al. Network Dissection: Quantifying Interpretability of Deep Visual Representations, 2017, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).

[6] Junghyuk Ko, et al. Design and fabrication of auxetic stretchable force sensor for hand rehabilitation, 2015.

[7] Jeffrey T. Clark, et al. Digital analysis: Manual dexterity in Neanderthals, 2003, Nature.

[8] J. R. Flanagan, et al. Coming to grips with weight perception: Effects of grasp configuration on perceived heaviness, 2000, Perception & Psychophysics.

[9] Edward H. Adelson, et al. Localization and manipulation of small parts using GelSight tactile sensing, 2014, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

[10] Chiara Bartolozzi, et al. Robots with a sense of touch, 2016, Nature Materials.

[11] Jeffrey M. Yau, et al. Feeling form: the neural basis of haptic shape perception, 2016, in Neurophysiology of Tactile Perception: A Tribute to Steven Hsiao.

[12] Sergey Levine, et al. End-to-End Training of Deep Visuomotor Policies, 2015, Journal of Machine Learning Research.

[13] J. Randall Flanagan, et al. Coding and use of tactile signals from the fingertips in object manipulation tasks, 2009, Nature Reviews Neuroscience.

[14] Yang Gao, et al. Deep learning for tactile understanding from visual and haptic data, 2016, IEEE International Conference on Robotics and Automation (ICRA).

[15] H. Bülthoff, et al. Viewpoint Dependence in Visual and Haptic Object Recognition, 2001, Psychological Science.

[16] Zhenan Bao, et al. Pursuing prosthetic electronic skin, 2016, Nature Materials.

[17] M. W. Marzke, et al. Precision grips, hand morphology, and tools, 1997, American Journal of Physical Anthropology.

[18] Lorenzo Rosasco, et al. Combining sensory modalities and exploratory procedures to improve haptic object recognition in robotics, 2016, IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids).

[19] H. Ross, et al. Sensorimotor mechanisms in weight discrimination, 1984, Perception & Psychophysics.

[20] Antonio Torralba, et al. 80 Million Tiny Images: A Large Dataset for Non-parametric Object and Scene Recognition, 2008, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[21] Geoffrey E. Hinton, et al. Visualizing Data using t-SNE, 2008, Journal of Machine Learning Research.

[22] Jonghwa Park, et al. Fingertip skin–inspired microstructured ferroelectric skins discriminate static/dynamic pressure and temperature stimuli, 2015, Science Advances.

[23] Michael S. Bernstein, et al. ImageNet Large Scale Visual Recognition Challenge, 2014, International Journal of Computer Vision.

[24] Tommaso D'Alessio, et al. Measurement errors in the scanning of piezoresistive sensors arrays, 1999.

[25] Nitish V. Thakor, et al. Prosthesis with neuromorphic multilayered e-dermis perceives touch and pain, 2018, Science Robotics.

[26] Danica Kragic, et al. The GRASP Taxonomy of Human Grasp Types, 2016, IEEE Transactions on Human-Machine Systems.

[27] T. Bachmann. Identification of spatially quantised tachistoscopic images of faces: How many pixels does it take to carry identity?, 1991.

[28] Gerald E. Loeb, et al. Haptic feature extraction from a biomimetic tactile sensor: Force, contact location and curvature, 2011, IEEE International Conference on Robotics and Biomimetics.

[29] Thomas Feix, et al. Estimating thumb–index finger precision grip and manipulation potential in extant and fossil primates, 2015, Journal of the Royal Society Interface.

[30] Paolo Dario, et al. A tactile array sensor layered in an artificial skin, 1995, IEEE/RSJ International Conference on Intelligent Robots and Systems.

[31] R. Klatzky, et al. Haptic perception: A tutorial, 2009, Attention, Perception & Psychophysics.

[32] Véronique Perdereau, et al. Tactile sensing in dexterous robot hands - Review, 2015, Robotics and Autonomous Systems.

[33] Christopher G. Atkeson, et al. Combining finger vision and optical tactile sensing: Reducing and handling errors while cutting vegetables, 2016, IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids).

[34] J. Napier. The prehensile movements of the human hand, 1956, The Journal of Bone and Joint Surgery, British Volume.

[35] Benoit P. Delhaye, et al. Simulating tactile signals from the whole hand with millisecond precision, 2017, Proceedings of the National Academy of Sciences.

[36] Allison M. Okamura, et al. An overview of dexterous manipulation, 2000, IEEE International Conference on Robotics and Automation (ICRA).

[37] R. Klatzky, et al. Hand movements: A window into haptic object recognition, 1987, Cognitive Psychology.

[38] R. L. Klatzky, et al. Identifying objects by touch: An “expert system”, 1985, Perception & Psychophysics.

[39] Jian Sun, et al. Deep Residual Learning for Image Recognition, 2016, IEEE Conference on Computer Vision and Pattern Recognition (CVPR).

[40] Ken Goldberg, et al. Learning ambidextrous robot grasping policies, 2019, Science Robotics.

[41] Giulio Sandini, et al. An embedded artificial skin for humanoid robots, 2008, IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems.

[42] Helge J. Ritter, et al. Distinguishing sliding from slipping during object pushing, 2016, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

[43] Steven J. M. Jones, et al. Circos: an information aesthetic for comparative genomics, 2009, Genome Research.

[44] Peter Corke, et al. Closing the Loop for Robotic Grasping: A Real-time, Generative Grasp Synthesis Approach, 2018, Robotics: Science and Systems.

[45] Luca Antiga, et al. Automatic differentiation in PyTorch, 2017.