Dynamic Omnidirectional Vision Localization Using a Beacon Tracker Based on Particle Filter

Autonomous navigation is of primary importance in applications involving Autonomous Guided Vehicles (AGVs). Vision-based navigation systems are an attractive option for both indoor and outdoor navigation because, unlike GPS, they require no external supporting infrastructure. The environment must, however, contain natural or artificial features that the vision system can observe and that can be related to spatial locations in the navigation environment (Cao, 2001). An omnidirectional camera system produces a spherical field of view of the environment. This is particularly useful in vision-based navigation because every image provided by the camera contains the same information regardless of the robot's rotation about the camera's optical axis, which makes the computed image features well suited to localization and navigation (Hrabar & Sukhatme, 2003; Hampton et al., 2004). The methods proposed here were developed for vision-based navigation of Autonomous Ground Vehicles that use an omnidirectional camera system as the vision sensor. A complete vision-based navigation system has been implemented, including the omnidirectional color camera system, the image processing algorithms, and the navigation algorithms. The aim is to provide a robust platform that can be used in both indoor and outdoor AGV applications (Cauchois et al., 2005; Sun et al., 2004). A fisheye lens is one of the most efficient ways to build an omnidirectional vision system: its structure is compact and robust, unlike that of catadioptric (mirror-based) systems, which consist of two parts and are fragile (Li et al., 2006; Ying et al., 2006).
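The rotation invariance noted above can be made concrete with a small sketch: unwrapping the circular omni-image onto a polar (panoramic) grid turns a rotation of the robot about the optical axis into a simple circular shift of panorama columns. The function below is an illustrative assumption, not the chapter's implementation; the name `unwrap_to_panorama` and all parameter values are hypothetical.

```python
import numpy as np

def unwrap_to_panorama(omni, center, r_min, r_max, n_theta=360, n_r=40):
    """Resample a circular omnidirectional image onto a polar grid.

    Rows index radius (elevation), columns index azimuth, so a robot
    rotation about the optical axis becomes a circular column shift.
    Nearest-neighbour sampling keeps the sketch short.
    """
    cx, cy = center
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    radii = np.linspace(r_min, r_max, n_r)
    xs = np.rint(cx + np.outer(radii, np.cos(thetas))).astype(int)
    ys = np.rint(cy + np.outer(radii, np.sin(thetas))).astype(int)
    return omni[ys, xs]  # shape (n_r, n_theta)
```

A rotation of the vehicle by Δ radians then corresponds to `np.roll(panorama, shift, axis=1)` with `shift = round(Δ / (2π) · n_theta)`, which is why features computed from such a panorama can be made rotation invariant.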
Omnidirectional vision (omni-vision) holds promise for a variety of applications. We build our omnidirectional vision system with a fisheye lens mounted facing upward, with a viewing angle of 185°. Although a fisheye lens offers an extremely wide angle of view, its images contain an inherent distortion that must be rectified to recover the original scene, so an approach for the geometric restoration of omni-vision images has to be considered. The mapping between image coordinates and the physical-space parameters of targets can be obtained from the imaging principle of the fisheye lens. First, a method for calibrating the omni-vision system is proposed; it relies on a cylinder whose inner wall carries several straight calibration lines.
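The image-to-space mapping mentioned above can be sketched under the common equidistant fisheye model, in which radial image distance is proportional to the angle from the optical axis (r = f·θ). This lens model is an assumption for illustration only (the chapter's actual calibration may use a different model), and the function names and parameter values are hypothetical.

```python
import numpy as np

def pixel_to_ray(u, v, cx, cy, f):
    """Back-project a fisheye pixel to a unit viewing direction.

    Equidistant model (assumed): r = f * theta, where theta is the
    angle from the optical axis and phi the azimuth in the image plane.
    """
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)
    theta = r / f                      # angle from optical axis (rad)
    phi = np.arctan2(dy, dx)           # azimuth around the axis
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def ray_to_pixel(d, cx, cy, f):
    """Project a 3-D viewing direction back into the fisheye image."""
    d = np.asarray(d, dtype=float)
    d = d / np.linalg.norm(d)
    theta = np.arccos(np.clip(d[2], -1.0, 1.0))
    phi = np.arctan2(d[1], d[0])
    r = f * theta                      # equidistant projection
    return cx + r * np.cos(phi), cy + r * np.sin(phi)
```

For a 185° lens, θ reaches about 92.5°, so under this model the image circle radius is roughly 1.61·f; targets behind the image plane (θ > 90°) still project to valid pixels, which is what gives the system its hemispherical-plus field of view.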

[1] Cyril Cauchois, et al. Robotic assistance: an automatic wheelchair tracking and following functionality by omnidirectional vision, 2005, 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[2] Chunru Wan, et al. Classification using support vector machines with graded resolution, 2005, 2005 IEEE International Conference on Granular Computing.

[3] Fan Zhang, et al. Adaptive Randomized Hough Transform for Circle Detection using Moving Window, 2006, 2006 International Conference on Machine Learning and Cybernetics.

[4] Christian Bräuer-Burchardt, et al. A new algorithm to correct fish-eye- and strong wide-angle-lens-distortion from single images, 2001, Proceedings 2001 International Conference on Image Processing (Cat. No.01CH37205).

[5] Jianhua Wang, et al. A New Calibration Model and Method of Camera Lens Distortion, 2006, 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[6] N. Oudjane, et al. Recent particle filter applied to terrain navigation, 2000, Proceedings of the Third International Conference on Information Fusion.

[7] Qixin Cao, et al. An object tracking and global localization method using omnidirectional vision system, 2004, Fifth World Congress on Intelligent Control and Automation (IEEE Cat. No.04EX788).

[8] Sing Bing Kang, et al. Parameter-Free Radial Distortion Correction with Center of Distortion Estimation, 2007, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[9] Gaurav S. Sukhatme, et al. Omnidirectional vision for an autonomous helicopter, 2003, 2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422).

[10] Jin Cao, et al. Omnivision-based autonomous mobile robotic platform, 2001, SPIE Optics East.

[11] Jing Yang, et al. A parallel SVM training algorithm on large-scale classification problems, 2005, 2005 International Conference on Machine Learning and Cybernetics.

[12] Jae Wook Jeon, et al. A Real-Time Object Tracking System Using a Particle Filter, 2006, 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[13] Zuoliang Cao, et al. Omni-directional Vision Localization Based on Particle Filter, 2007, Fourth International Conference on Image and Graphics (ICIG 2007).

[14] Emanuele Menegatti, et al. Omnidirectional vision scan matching for robot localization in dynamic environments, 2006, IEEE Transactions on Robotics.

[15] C. Ishii, et al. An image conversion algorithm from fish eye image to perspective image for human eyes, 2003, Proceedings 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003).

[16] Blace C. Albert, et al. An autonomous tracked vehicle with omnidirectional sensing, 2004.

[17] Huang Tianshu, et al. The real-time image processing based on DSP, 2005, 2005 9th International Workshop on Cellular Neural Networks and Their Applications.

[18] Shigang Li, et al. Full-View Spherical Image Camera, 2006, 18th International Conference on Pattern Recognition (ICPR'06).

[19] Hongbin Zha, et al. Using Sphere Images for Calibrating Fisheye Cameras under the Unified Imaging Model of the Central Catadioptric and Fisheye Cameras, 2006, 18th International Conference on Pattern Recognition (ICPR'06).