Overcoming occlusions in eye-in-hand visual search

In this paper we propose a method for handling persistent visual occlusions that disrupt visual tracking for eye-in-hand systems. This approach provides an efficient strategy for the robot to “look behind” the occlusion while respecting the robot's physical constraints. Specifically, we propose a decoupled search strategy that combines a naïve pan-tilt search with a sensor placement approach to reduce the strategy's computational cost. We proceed by mapping limited environmental data into the robot's configuration space and then planning within a constrained region. We use a particle filter to continuously estimate the target location, while our configuration-based cost function plans a goal location for the camera frame, accounting for singularity, self-collision, and joint-limit constraints. To validate our algorithm, we implemented it on an eye-in-hand robot system. Experimental results in various situations support the feasibility of our approach for quickly recovering fully occluded moving targets. Finally, we discuss the implications of this approach for mobile robot platforms.
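The particle-filter target estimate described above can be illustrated with a minimal sketch. The class below is an assumption for illustration only: the paper's actual motion and sensor models are not specified in the abstract, so a simple 2-D random-walk prediction and Gaussian-likelihood update stand in for them. During a persistent occlusion, only the prediction step runs, so the particle cloud diffuses and the estimated target uncertainty grows until the camera reacquires the target.

```python
import math
import random


class ParticleFilter:
    """Minimal 2-D particle filter for estimating an occluded target's
    position. Illustrative sketch only; the motion model, sensor model,
    and parameters here are assumptions, not the paper's implementation."""

    def __init__(self, n=500, bounds=(0.0, 1.0)):
        lo, hi = bounds
        # Initialize particles uniformly over a square workspace.
        self.particles = [(random.uniform(lo, hi), random.uniform(lo, hi))
                          for _ in range(n)]
        self.weights = [1.0 / n] * n

    def predict(self, noise=0.02):
        # Random-walk motion model: diffuse every particle. During a
        # persistent occlusion this is the only step that runs.
        self.particles = [(x + random.gauss(0.0, noise),
                           y + random.gauss(0.0, noise))
                          for x, y in self.particles]

    def update(self, measurement, sigma=0.05):
        # Reweight particles by a Gaussian likelihood of the observed
        # target position, then resample.
        mx, my = measurement
        w = [math.exp(-((x - mx) ** 2 + (y - my) ** 2) / (2 * sigma ** 2))
             for x, y in self.particles]
        total = sum(w) or 1e-12
        self.weights = [wi / total for wi in w]
        self._resample()

    def _resample(self):
        # Draw a new particle set proportional to the weights.
        self.particles = random.choices(self.particles,
                                        weights=self.weights,
                                        k=len(self.particles))
        n = len(self.particles)
        self.weights = [1.0 / n] * n

    def estimate(self):
        # Weighted mean of the particles as the target location estimate.
        ex = sum(w * x for (x, _), w in zip(self.particles, self.weights))
        ey = sum(w * y for (_, y), w in zip(self.particles, self.weights))
        return ex, ey
```

In a full system, the estimate from `estimate()` would feed the configuration-space cost function that selects the next camera goal; that coupling is beyond the scope of this sketch.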
