Identifying objects from hand configurations during in-hand exploration

In this work we use hand configurations and contact points acquired during in-hand object exploration to identify the manipulated objects. The contact points associated with an object shape can be represented in a latent space: they lie on a lower-dimensional, non-linear manifold of the contact-point space, which makes them suitable for modelling and recognition. Hand configurations are associated with specific objects by means of Gaussian mixture models; during in-hand exploration, the observed hand configuration is then used to generate hypotheses about candidate objects, selecting the most probable objects from a database. The set of contact points accumulated during exploration (a partial volume of the object shape) is matched against this set of candidates. Results are presented for human manipulation of objects; the approach can also be applied to artificial hands, although we address only object identification, not hand control.
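The hypothesis-generation step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it models each object's hand-configuration vectors with a single diagonal-covariance Gaussian (a one-component simplification of the Gaussian mixture models the text names), and the object names, feature dimensions, and example values are all hypothetical.

```python
import math

def fit_diag_gaussian(samples):
    """Fit a diagonal-covariance Gaussian to hand-configuration vectors
    (simplified stand-in for a full Gaussian mixture model)."""
    n, d = len(samples), len(samples[0])
    mean = [sum(s[j] for s in samples) / n for j in range(d)]
    # Floor the variance so degenerate dimensions do not blow up the density.
    var = [max(sum((s[j] - mean[j]) ** 2 for s in samples) / n, 1e-6)
           for j in range(d)]
    return mean, var

def log_likelihood(x, model):
    """Log-density of configuration x under a diagonal Gaussian."""
    mean, var = model
    return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
               for xi, m, v in zip(x, mean, var))

def candidate_objects(query, models, k=2):
    """Rank database objects by likelihood of the observed hand
    configuration and return the k most probable candidates."""
    ranked = sorted(models, key=lambda name: log_likelihood(query, models[name]),
                    reverse=True)
    return ranked[:k]

# Hypothetical 3-D hand-configuration features per object (e.g. normalized
# finger-aperture terms); values are illustrative only.
train = {
    "sphere":   [[0.9, 0.8, 0.7], [1.0, 0.9, 0.8], [0.8, 0.85, 0.75]],
    "cylinder": [[0.4, 0.9, 0.2], [0.5, 1.0, 0.3], [0.45, 0.95, 0.25]],
    "box":      [[0.1, 0.2, 0.9], [0.2, 0.3, 1.0], [0.15, 0.25, 0.95]],
}
models = {name: fit_diag_gaussian(samples) for name, samples in train.items()}

# An observed configuration during exploration selects the most probable
# candidates; their stored shapes would then be matched to the accumulated
# contact points.
print(candidate_objects([0.92, 0.84, 0.74], models, k=2))
```

The returned candidate set plays the role of the database subset described in the abstract: only these objects would be passed on to the contact-point matching stage.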
