3-D hand & eye-vergence approaching visual servoing with Lyapunov-stable pose tracking

In this paper, we focus on controlling a robot end-effector to track an object while simultaneously approaching it with a posture suitable for grasping, a task we call “Approaching Visual Servoing”. We propose a hand & eye-vergence dual control system to perform Approaching Visual Servoing, aiming at quick eye tracking together with stable hand servoing and approaching. The idea stems from the hammerhead shark, whose eyes turn to gaze at the target prey in a configuration well suited to triangulation, enhancing its ability to measure the distance to the prey precisely for catching it. This animal's visual tracking combines motion control by visual servoing with triangular eye vergence. Moreover, a 3-D pose tracking method that combines a “1-Step GA (genetic algorithm)” with hand-motion feedforward compensation is proposed. Our approach differs from known tracking methods based on optimization via Taylor expansion, since the proposed method is not troubled by the problem of escaping local minima. Convergence in the time domain, i.e., whether the 3-D pose tracking error decreases to zero over successive images input at video rate, is discussed and verified by the Lyapunov method. Both Lyapunov-stable pose tracking and Approaching Visual Servoing are confirmed by experiments using a 7-link manipulator equipped with two mobile cameras.
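To make the per-frame tracking loop concrete, the following is a minimal sketch, assuming a NumPy implementation in which each pose candidate is a 6-vector (x, y, z, roll, pitch, yaw), the fitness is a user-supplied model-matching score against the current image, and the feedforward compensation is realized by shifting every candidate by the pose change predicted from the measured hand motion. The function name one_step_ga_track, its parameters, and the toy usage below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def one_step_ga_track(fitness, population, hand_motion_delta,
                      n_elite=10, mutation_scale=0.01, rng=None):
    """Run ONE GA generation per video frame (the '1-Step GA' idea).

    fitness           -- callable mapping a 6-D pose candidate to a matching
                         score against the current image (higher is better).
    population        -- (N, 6) array of pose candidates.
    hand_motion_delta -- pose change predicted from the known hand/camera
                         motion since the last frame; adding it to every
                         candidate is the feedforward compensation.
    """
    rng = np.random.default_rng() if rng is None else rng

    # Feedforward compensation: shift all candidates by the predicted pose
    # change so the search starts near where the target has moved.
    population = population + hand_motion_delta

    # Evaluate every candidate once (only one generation per frame).
    scores = np.apply_along_axis(fitness, 1, population)
    order = np.argsort(scores)[::-1]

    # Elitist selection: keep the best candidates as parents.
    elite = population[order[:n_elite]]

    # Refill the population by mutating randomly chosen parents.
    n_children = len(population) - n_elite
    parents = elite[rng.integers(0, n_elite, size=n_children)]
    children = parents + rng.normal(scale=mutation_scale, size=parents.shape)

    best_pose = population[order[0]]  # pose estimate for this frame
    return best_pose, np.vstack([elite, children])


# Toy usage: a stationary target and a synthetic fitness stand in for the
# image-based model-matching score; the estimate moves toward true_pose
# over successive "frames".
true_pose = np.array([0.3, 0.0, 0.5, 0.0, 0.0, 0.0])
fitness = lambda p: -np.linalg.norm(p - true_pose)
population = np.random.default_rng(0).normal(true_pose, 0.05, size=(60, 6))
for _ in range(30):
    est, population = one_step_ga_track(fitness, population,
                                        hand_motion_delta=np.zeros(6))
print(est)
```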
