Precise Bearing Angle Measurement Based on Omnidirectional Conic Sensor and Defocusing

Recent studies on multi-robot localization have shown that the uncertainty of a robot's location can be considerably reduced by optimally fusing odometry with the relative angles of sight (bearings) among the team members. However, the latter requires that each robot be able to detect the other members at large distances and over a wide field of view. Furthermore, robustness and precision in estimating the relative angle of sight are of high importance. In this paper we show how all of these requirements can be met by employing an omnidirectional sensor made up of a conic mirror and a simple webcam. We use lights of different colors to distinguish the robots and optical defocusing to identify the lights. We show that defocusing increases the detection range to several meters, compensating for the loss of resolution inherent in the omnidirectional view, without sacrificing robustness or precision. To allow a real-time implementation of light tracking, we use a recent "tree-based union find" technique for color segmentation and region merging. We also present a self-calibration technique based on an Extended Kalman Filter to derive the intrinsic parameters of the robot-sensor system. The performance of the approach is demonstrated through experimental results.
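
As a rough illustration of the light-tracking pipeline described above, the sketch below combines a two-pass connected-component labeling based on union find (a generic stand-in for the paper's "tree-based union find" segmentation and region merging, whose exact formulation is not given here) with a bearing computation that takes the angle of a blob centroid about the mirror axis. The function names, the boolean color mask, and the image center (cx, cy) are illustrative assumptions; in the paper the intrinsic parameters are estimated by the EKF-based self-calibration rather than assumed known.

```python
import math

def find(parent, x):
    # Union-find "find" with path halving: walk to the root, flattening as we go.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(parent, a, b):
    # Merge the trees containing labels a and b.
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

def label_components(mask, width, height):
    """Two-pass connected-component labeling over a binary color mask
    (mask[y][x] is True where a pixel matches the target light color)."""
    parent = {}
    labels = [[None] * width for _ in range(height)]
    next_label = 0
    for y in range(height):
        for x in range(width):
            if not mask[y][x]:
                continue
            left = labels[y][x - 1] if x > 0 and mask[y][x - 1] else None
            up = labels[y - 1][x] if y > 0 and mask[y - 1][x] else None
            if left is None and up is None:
                labels[y][x] = next_label          # start a new region
                parent[next_label] = next_label
                next_label += 1
            elif left is not None and up is not None:
                labels[y][x] = left
                union(parent, left, up)            # merge touching regions
            else:
                labels[y][x] = left if left is not None else up
    # Second pass: collapse every provisional label to its root region.
    for y in range(height):
        for x in range(width):
            if labels[y][x] is not None:
                labels[y][x] = find(parent, labels[y][x])
    return labels

def bearing_of_blob(labels, target, cx, cy):
    """Bearing (radians) of a labeled blob, measured around the mirror axis.
    (cx, cy) is the image projection of the cone apex -- hypothetical
    calibration values standing in for the EKF-estimated intrinsics."""
    xs, ys = [], []
    for y, row in enumerate(labels):
        for x, lab in enumerate(row):
            if lab == target:
                xs.append(x)
                ys.append(y)
    ux = sum(xs) / len(xs)
    uy = sum(ys) / len(ys)
    return math.atan2(uy - cy, ux - cx)
```

In this sketch, each detected colored light yields one blob whose centroid angle about the image center is taken as the bearing to the corresponding robot; defocusing enlarges the blob so that distant lights still cover enough pixels for a stable centroid.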
