Probabilistic Classification of Grasping Behaviours Using Visuo-Haptic Perception

This paper presents a novel approach to visuo-haptic perception of grasping and manipulation tasks. The approach is built on a hierarchical Bayesian model that integrates visual information with haptic data to form a coherent percept of an ongoing grasping task. Its primary goal is to identify which type of grasping behaviour a human subject is performing; a secondary goal is to simultaneously assess the quality of that behaviour. For the simple set of grasping behaviours defined in this paper, preliminary experimental results indicate that the proposed approach can yield robust and efficient perception of grasp behaviours.
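The core idea of fusing visual and haptic evidence in a Bayesian model can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the grasp classes, the likelihood values, and the conditional-independence factorisation are hypothetical and are not taken from the paper, which uses a richer hierarchical model.

```python
# Minimal sketch of Bayesian visuo-haptic fusion for grasp classification.
# All class names and likelihood values are hypothetical placeholders.

GRASPS = ["power", "precision", "lateral"]

# Uniform prior over grasp behaviours.
prior = {g: 1.0 / len(GRASPS) for g in GRASPS}

# Hypothetical class-conditional likelihoods for one observed frame:
# P(visual feature | grasp) and P(haptic feature | grasp).
p_visual = {"power": 0.6, "precision": 0.3, "lateral": 0.1}
p_haptic = {"power": 0.7, "precision": 0.2, "lateral": 0.1}

def fuse(prior, p_v, p_h):
    """Posterior over grasp classes, assuming the visual and haptic
    channels are conditionally independent given the grasp class."""
    unnorm = {g: prior[g] * p_v[g] * p_h[g] for g in prior}
    z = sum(unnorm.values())
    return {g: u / z for g, u in unnorm.items()}

posterior = fuse(prior, p_visual, p_haptic)
best = max(posterior, key=posterior.get)
```

Under these made-up numbers both channels favour the "power" class, so the fused posterior concentrates on it; in the paper's setting the same mechanism would also support grading grasp quality from the posterior's confidence.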
