Perception-based search and manipulation in a semi-structured environment

In this paper we present two new approaches, developed in the context of autonomous search, for the localization and manipulation of books in a semi-structured environment such as a library. Books are located by means of visual information, which is also integrated with force sensing for book manipulation. Interest regions are located using a local segmentation algorithm with automatic threshold selection. From the extracted visual features, we have implemented a fast tracking method based on model-based matching. We also apply recognition techniques in order to find the desired book and to guide the gripper over it. Grasping is finally performed by integrating visual and force perception into a global hybrid control law. We have applied the implemented algorithms to the UJI librarian robot; the results show the robustness of the search algorithm as well as its fast performance. With respect to the grasping system, vision-force coupling is shown to be an adequate sensor combination for book manipulation.
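
The abstract refers to a global hybrid control law that couples visual and force perception for grasping. The sketch below illustrates one common way such a law can be structured: a diagonal selection matrix splits the task-space axes between visual servoing and force regulation. The function `hybrid_control_step`, the gains `K_v`/`K_f`, and the particular selection matrix `S` are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Minimal sketch of a hybrid vision/force control step (illustrative only).
# The selection matrix S, the gains K_v and K_f, and the 6-DOF task-space
# layout are assumptions; the paper does not spell out its exact law here.

def hybrid_control_step(visual_error, force_error, S, K_v=0.5, K_f=0.01):
    """Combine a visual-servoing term and a force-regulation term.

    visual_error : (6,) task-space error derived from image features
    force_error  : (6,) measured wrench minus desired wrench
    S            : (6, 6) diagonal selection matrix; 1 selects vision control
                   along that axis, 0 selects force control.
    Returns a (6,) task-space velocity command for the gripper.
    """
    I = np.eye(6)
    v_vision = -K_v * (S @ visual_error)        # servo toward the target book
    v_force = -K_f * ((I - S) @ force_error)    # comply along contact axes
    return v_vision + v_force

# Example: control x/y and orientation with vision, regulate force along z.
S = np.diag([1, 1, 0, 1, 1, 1])
cmd = hybrid_control_step(np.array([0.02, -0.01, 0, 0, 0, 0.05]),
                          np.array([0, 0, 2.0, 0, 0, 0]), S)
```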
