Strategies for Object Manipulation using Foveal and Peripheral Vision

Visual feedback is used extensively in robotics, with application areas ranging from human-robot interaction to object grasping and manipulation. While many of the individual components required by such applications have been demonstrated, there are very few general vision systems capable of performing a variety of tasks. In this paper, we concentrate on vision strategies for robotic manipulation tasks in a domestic environment. In particular, for fetch-and-carry tasks, we consider the issues arising across the whole detect-approach-grasp loop. We address flexibility and robustness by using monocular and binocular visual cues and their integration. We demonstrate real-time disparity estimation, object recognition, and pose estimation. We also show how foveal and peripheral vision can be combined to provide both a wide, low-resolution and a narrow, high-resolution field of view.
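The wide/narrow split described above can be illustrated with a minimal sketch: a low-resolution peripheral view is obtained by downsampling the full frame, while a foveal view is a full-resolution crop around the current fixation point. The function names and parameters here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def peripheral_view(image, scale=4):
    # Wide field of view at low resolution: subsample the full frame.
    return image[::scale, ::scale]

def foveal_view(image, center, size=64):
    # Narrow field of view at full resolution: crop a window
    # around the fixation point (clamped to the image border).
    r, c = center
    half = size // 2
    r0 = min(max(r - half, 0), image.shape[0] - size)
    c0 = min(max(c - half, 0), image.shape[1] - size)
    return image[r0:r0 + size, c0:c0 + size]

frame = np.zeros((480, 640), dtype=np.uint8)
wide = peripheral_view(frame)            # coarse 120 x 160 overview
narrow = foveal_view(frame, (240, 320))  # sharp 64 x 64 fovea
```

In a detect-approach-grasp loop, the peripheral view would serve coarse detection and attention, after which the fovea is directed at the candidate region for recognition and pose estimation.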
