Recognizing the grasp intention from human demonstration

In human grasping, choices about the use of hand parts are made even before a grasp is realized. The human associates these choices with end functionality and is confident that the resulting grasp will meet the task requirements. We refer to these choices on the use of hand parts underlying grasp formation as the grasp intention. Modeling the grasp intention offers a paradigm in which the decisions underlying grasp formation can be related to the functional properties of the realized grasp, expressed in quantities that can be sensed, recognized, or controlled. In this paper we model grasp intention as a mix of oppositions between hand parts: sub-parts of the hand acting in opposition to each other are viewed as a basis from which grasps are formed. We compute a set of such possible oppositions and determine the most likely combination from the raw information present in a demonstrated grasp. An intermediate representation of the raw sensor data exposes interactions between elementary grasping surfaces, and from this representation the most likely combination of oppositions is inferred. Grasping experiments with human subjects show that the proposed approach is robust enough to correctly capture the intention in demonstrated grasps across a wide range of hand functionality.

In summary, we propose a general method to interpret human grasp behavior in terms of opposition primitives. A primitive model consisting of 41 oppositions for the hand is defined. The most likely primitive combination is inferred from tactile and configuration data. An 87% recognition rate is achieved over a wide range of human grasp behavior.
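To make the inference pipeline concrete, the following Python sketch illustrates the general idea of scoring candidate oppositions between elementary grasping surfaces against tactile and configuration data. This is a minimal toy version, not the authors' implementation: the surface set, the names (Opposition, opposition_score, infer_oppositions), the multiplicative score, and the threshold are all illustrative assumptions, and the paper's actual model comprises 41 oppositions rather than the handful of pairs enumerated here.

```python
# Toy sketch of opposition inference from a demonstrated grasp.
# NOT the paper's method: all names, scores, and thresholds are hypothetical.

from itertools import combinations
from typing import Dict, List, Tuple

# Elementary grasping surfaces of the hand (a simplified subset).
SURFACES = ["palm", "thumb_pad", "index_pad", "middle_pad",
            "ring_pad", "little_pad"]

Opposition = Tuple[str, str]  # a pair of surfaces acting against each other


def candidate_oppositions() -> List[Opposition]:
    """Enumerate surface pairs as candidate oppositions.
    (The paper defines a richer model of 41 oppositions; this is a toy set.)"""
    return list(combinations(SURFACES, 2))


def opposition_score(opp: Opposition,
                     contact: Dict[str, float],
                     alignment: Dict[Opposition, float]) -> float:
    """Score one opposition from sensed data: both surfaces should be in
    contact (tactile data) and roughly opposed (configuration data)."""
    a, b = opp
    return contact.get(a, 0.0) * contact.get(b, 0.0) * alignment.get(opp, 0.0)


def infer_oppositions(contact: Dict[str, float],
                      alignment: Dict[Opposition, float],
                      threshold: float = 0.25) -> List[Opposition]:
    """Return the combination of oppositions whose scores exceed a threshold,
    i.e. the most likely mix underlying the demonstrated grasp."""
    scored = [(opp, opposition_score(opp, contact, alignment))
              for opp in candidate_oppositions()]
    return [opp for opp, s in scored if s > threshold]


# Example: a pinch between thumb and index, with the palm barely touching.
contact = {"thumb_pad": 0.9, "index_pad": 0.8, "palm": 0.1}
alignment = {("thumb_pad", "index_pad"): 0.95}
print(infer_oppositions(contact, alignment))  # [('thumb_pad', 'index_pad')]
```

Here the tactile glove readings reduce to per-surface contact strengths and the hand configuration to a pairwise alignment term; the recognized intention is simply the set of well-supported oppositions, which for the example above is the thumb-index pad opposition of a precision pinch.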
