2½D visual servoing

We propose an approach to vision-based robot control, called 2½D visual servoing, which avoids the respective drawbacks of classical position-based and image-based visual servoing. Contrary to position-based visual servoing, our scheme does not need any 3-D geometric model of the object. Furthermore, and contrary to image-based visual servoing, our approach ensures the convergence of the control law in the whole task space. 2½D visual servoing is based on the estimation of the partial camera displacement from the current to the desired camera pose at each iteration of the control law. Visual features and data extracted from this partial displacement allow us to design a decoupled control law controlling the six camera degrees of freedom. The robustness of our visual servoing scheme with respect to camera calibration errors is also analyzed: necessary and sufficient conditions for local asymptotic stability are easily obtained. Then, owing to the simple structure of the system, sufficient conditions for global asymptotic stability are established. Finally, experimental results with an eye-in-hand robotic system confirm the improved stability and enlarged convergence domain of 2½D visual servoing with respect to classical position-based and image-based visual servoing.
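The decoupling described in the abstract can be sketched in code. The following is a minimal, heavily simplified illustration, not the paper's exact control law: it assumes the rotation between the current and desired camera frames is available as an angle/axis estimate θu (obtainable from a homography decomposition), and that the depth ratio log(Z/Z*) of a reference point is known. The coupling term between translation and rotation in the interaction matrix is deliberately omitted, and the translational interaction matrix is approximated by a scaled identity; all function and variable names are illustrative.

```python
import numpy as np

def two_half_d_control_step(x, x_star, log_rho, theta, u, Z_star, lam=0.5):
    """One illustrative 2.5D visual servoing iteration (simplified sketch).

    x, x_star : (2,) normalized image coordinates of a reference point,
                current and desired
    log_rho   : log(Z / Z*), depth ratio from partial displacement estimation
    theta, u  : rotation angle (rad) and unit axis (3,) between current
                and desired camera frames
    Z_star    : desired depth of the reference point
    lam       : control gain
    Returns (v, omega): camera translational and rotational velocities.
    """
    u = np.asarray(u, dtype=float)
    # Rotation is controlled directly from the theta*u estimate: this part
    # of the law is fully decoupled from the image features.
    omega = -lam * theta * u
    # Extended image error [x - x*, y - y*, log(Z/Z*)] drives translation.
    # The true interaction matrix is approximated here by (1/Z*) * I.
    e_v = np.array([x[0] - x_star[0], x[1] - x_star[1], log_rho])
    v = -lam * Z_star * e_v
    return v, omega
```

At convergence the extended image error and the angle θ both vanish, so both velocity commands go to zero; away from convergence, the rotational command depends only on the estimated rotation, which is what gives the scheme its decoupled structure.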
