Extraction of grasp-related features by human dual-hand object exploration

We consider the problem of object exploration for grasping, specifically in cases where vision-based methods are not applicable. A novel dual-hand object exploration method is proposed that takes advantage of a human demonstration to enrich the knowledge about an object. The user handles the object freely with both hands, without restricting the object pose. A set of grasp-related features obtained during exploration is presented and used to generate grasp-oriented bounding boxes that form the basis for pre-grasp hypotheses. We believe that such exploration, performed in a natural and user-friendly way, creates an important link between an operator's intention and a robot's action.
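
To illustrate the kind of processing the abstract describes, the sketch below fits an oriented bounding box to 3D contact points collected during hand-held exploration, using a simple PCA alignment. This is not the paper's actual algorithm; the function and variable names (fit_oriented_bounding_box, contact points generated synthetically) are illustrative assumptions only.

```python
# Minimal sketch (assumed, not the paper's method): derive a grasp-oriented
# bounding box from contact points recorded during dual-hand exploration.
import numpy as np

def fit_oriented_bounding_box(points: np.ndarray):
    """Return (center, axes, half_extents) of a PCA-aligned bounding box.

    points: (N, 3) array of contact-point positions from exploration.
    """
    center = points.mean(axis=0)
    centered = points - center
    # Principal axes of the point cloud approximate the object's main directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axes = vt                      # rows are orthonormal box axes
    local = centered @ axes.T      # express points in the box frame
    half_extents = (local.max(axis=0) - local.min(axis=0)) / 2.0
    # Re-center the box on the midpoint of the local extents.
    center = center + ((local.max(axis=0) + local.min(axis=0)) / 2.0) @ axes
    return center, axes, half_extents

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for exploration data: points inside an elongated box.
    pts = rng.uniform([-0.10, -0.03, -0.02], [0.10, 0.03, 0.02], size=(500, 3))
    c, R, h = fit_oriented_bounding_box(pts)
    print("center:", c)
    print("half extents:", h)
```

Such a box (one per explored object part, in a full pipeline) could then be ranked to propose pre-grasp poses, in the spirit of the box-based approximations cited in the related work.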
