Integration of camera and range sensors for 3D pose estimation in robot visual servoing

Range-vision sensor systems can incorporate either range images or single-point range measurements. Research using point range measurements has focused on map generation for mobile robots, where systems can exploit the fact that the sensed objects tend to be large and planar. The approach presented in this paper instead fuses information from a point range measurement with visual information to estimate the relative 3D position and orientation of a small, non-planar object with respect to a robot end-effector. The paper describes a real-time sensor fusion system for dynamic visual servoing using a camera and a point laser range sensor. The system is based on the object model reference approach. This approach, which can be used to develop multi-sensor fusion systems that fuse dynamic data from diverse sensors in real time, uses a description of the object to be sensed to develop a combined observation-dependency sensor model. The range-vision sensor system is evaluated in terms of accuracy and robustness. The results show that the range sensor significantly improves system performance when camera information is poor or insufficient. The system developed is suitable for visual servoing applications, particularly robot assembly operations.
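The abstract does not give the fusion equations, but the core idea — a precise point range reading constrains depth, the axis along which a single camera is weakest — can be illustrated with a minimal sketch. The sketch below is an assumption, not the paper's method: it fuses a noisy camera-derived depth estimate with a laser range reading by inverse-variance weighting, then back-projects the fused depth through a pinhole camera model. All parameter names and values (focal lengths, noise variances) are hypothetical.

```python
import numpy as np

# Hypothetical camera intrinsics (not from the paper).
FX = FY = 600.0          # focal lengths in pixels
CX, CY = 320.0, 240.0    # principal point in pixels

def fuse_depth(z_cam, var_cam, z_laser, var_laser):
    """Minimum-variance (inverse-variance weighted) fusion of two
    independent depth estimates of the same surface point."""
    w = var_laser / (var_cam + var_laser)          # weight on the camera estimate
    z = w * z_cam + (1.0 - w) * z_laser
    var = (var_cam * var_laser) / (var_cam + var_laser)
    return z, var

def backproject(u, v, z):
    """Pinhole back-projection of pixel (u, v) at depth z into
    3D camera-frame coordinates."""
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.array([x, y, z])

# Camera-only depth (e.g. inferred from apparent object scale) is noisy...
z_cam, var_cam = 0.52, 0.02 ** 2        # metres, metres^2
# ...while the point laser range sensor reads the same surface precisely.
z_laser, var_laser = 0.500, 0.002 ** 2

z, var = fuse_depth(z_cam, var_cam, z_laser, var_laser)
p = backproject(350.0, 250.0, z)
print(z, var, p)
```

The fused variance is smaller than either input variance, and the fused depth sits close to the precise laser reading, mirroring the paper's qualitative result that the range sensor dominates when camera depth information is poor.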
