Haptic discrimination of material properties by a robotic hand

One of the key aspects of understanding human intelligence is to investigate how humans interact with their environment. Performing articulated movements and manipulation tasks in a constantly changing environment has proven more difficult than expected. The difficulties of robot manipulation stem in part from the imbalance between vision and haptic sensing. Most robots are equipped with high-resolution cameras whose images are processed by well-established computer vision algorithms such as color segmentation, motion detection, and edge detection, yet the majority of robots have very limited haptic capabilities. This paper presents our attempt to overcome these difficulties by (a) using a tendon-driven robotic hand capable of rich dynamic movements and (b) covering the palm and fingertips with haptic sensors based on a simplified artificial skin that combines strain gauges and PVDF (polyvinylidene fluoride) films. The results show that when the robotic hand actively explores objects using the exploratory procedures of tapping and squeezing, material properties such as hardness and texture can be used to discriminate haptically between different objects.
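To make the pipeline concrete, the sketch below illustrates one plausible way such tactile data could be turned into material features and labels: vibration features from a PVDF trace recorded while tapping (texture), a stiffness proxy from strain gauges while squeezing (hardness), and a toy nearest-centroid classifier over the combined feature vector. This is a minimal illustration under assumed signal shapes and sampling rates, not the authors' implementation; all function and variable names are hypothetical.

```python
import numpy as np


def tapping_features(pvdf_signal, fs=1000.0):
    """Texture-related features from a PVDF trace recorded during tapping.

    Hypothetical feature choice: mean signal energy and the dominant
    vibration frequency of the contact transient.
    """
    x = np.asarray(pvdf_signal, dtype=float)
    x = x - x.mean()
    energy = float(np.sum(x ** 2) / len(x))
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    dominant = float(freqs[np.argmax(spectrum[1:]) + 1])  # skip the DC bin
    return np.array([energy, dominant])


def squeezing_features(strain_signal, finger_displacement):
    """Hardness-related feature from strain gauges during squeezing:
    slope of sensor response versus finger displacement (stiffness proxy)."""
    y = np.asarray(strain_signal, dtype=float)
    d = np.asarray(finger_displacement, dtype=float)
    slope, _ = np.polyfit(d, y, 1)
    return np.array([slope])


def object_features(pvdf_signal, strain_signal, displacement, fs=1000.0):
    """Concatenate tapping (texture) and squeezing (hardness) features."""
    return np.concatenate([tapping_features(pvdf_signal, fs),
                           squeezing_features(strain_signal, displacement)])


class NearestCentroid:
    """Toy classifier: assign a new exploration to the nearest class
    centroid in z-scored feature space."""

    def fit(self, X, labels):
        X = np.asarray(X, dtype=float)
        self.mu = X.mean(axis=0)
        self.sigma = X.std(axis=0) + 1e-9
        Z = (X - self.mu) / self.sigma
        self.classes = sorted(set(labels))
        self.centroids = np.array(
            [Z[[l == c for l in labels]].mean(axis=0) for c in self.classes])
        return self

    def predict(self, x):
        z = (np.asarray(x, dtype=float) - self.mu) / self.sigma
        idx = int(np.argmin(np.linalg.norm(self.centroids - z, axis=1)))
        return self.classes[idx]
```

In use, each object would be tapped and squeezed several times, every exploration reduced to a feature vector with `object_features`, and the resulting vectors fitted and queried with `NearestCentroid`; any unsupervised alternative (e.g. a self-organizing map) could be substituted for the classifier without changing the feature-extraction step.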
