Robotic grasping based on efficient tracking and visual servoing using local feature descriptors

In service robotic applications, grasping daily objects is an essential requirement. In this context, object and obstacle detection are used to find the desired object and to plan an obstacle-free path for the robot to manipulate it successfully. In this paper, we propose a high-speed object tracking method based on a window approach and a local feature descriptor, speeded-up robust features (SURF). Instead of tracking the object in the full image, we search and match features within a window of attention that contains only the object, so the tracked interest points are more repeatable and more robust to noise. The visual servo controller uses geometrical features computed directly from the set of interest points, which makes the method robust to the loss of features caused by occlusion or changes in viewpoint. Furthermore, these features decouple the translational and rotational components of the image Jacobian and keep the object inside the camera's field of view. Experiments with a robotic arm equipped with a monocular eye-in-hand camera demonstrate that objects can be grasped safely and stably in a cluttered environment using the proposed method.
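To illustrate the window-of-attention idea described above, the following is a minimal, hypothetical sketch (not the authors' code) of SURF detection and matching restricted to a region of interest around the tracked object, with matched points mapped back to full-image coordinates. It assumes an opencv-contrib-python build with the nonfree xfeatures2d module; cv2.ORB_create() could be substituted for SURF if that module is unavailable. All file names and the window coordinates are placeholders.

import cv2
import numpy as np

def match_in_window(reference_img, current_img, window, detector, matcher, ratio=0.75):
    """Detect and match features only inside the window of attention.

    window: (x, y, w, h) bounding box around the tracked object in current_img.
    Returns matched point coordinates in the reference image and in the
    full current image (ROI coordinates shifted back by the window offset).
    """
    x, y, w, h = window
    roi = current_img[y:y + h, x:x + w]

    kp_ref, des_ref = detector.detectAndCompute(reference_img, None)
    kp_roi, des_roi = detector.detectAndCompute(roi, None)
    if des_ref is None or des_roi is None:
        return np.empty((0, 2), np.float32), np.empty((0, 2), np.float32)

    # Lowe-style ratio test on k-nearest-neighbour matches.
    matches = matcher.knnMatch(des_ref, des_roi, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance]

    pts_ref = np.float32([kp_ref[m.queryIdx].pt for m in good])
    # Shift ROI coordinates back into the full-image frame.
    pts_cur = np.float32([kp_roi[m.trainIdx].pt for m in good]) + np.float32([x, y])
    return pts_ref, pts_cur

if __name__ == "__main__":
    detector = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    ref = cv2.imread("object_template.png", cv2.IMREAD_GRAYSCALE)
    cur = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
    pts_ref, pts_cur = match_in_window(ref, cur, window=(120, 80, 200, 200),
                                       detector=detector, matcher=matcher)
    print(f"{len(pts_cur)} matches inside the window of attention")

Restricting detection to the window is what gives the speed-up: SURF runs over a small patch rather than the whole frame, and the matched point set can then feed the geometrical features used by the visual servo controller.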
