An Omni-directional vSLAM based on spherical camera model and 3D modeling

This paper presents an efficient omni-directional visual simultaneous localization and mapping (vSLAM) algorithm based on a spherical camera model and 3D modeling. The robot is equipped with omni-directional vision, which makes the algorithm more adaptive in unknown environments. Spherical panoramic images are acquired with a panoramic image acquisition and mosaic rig (a divergent camera cluster). An improved SURF on spherical images is adopted for feature extraction and matching. Following the multiple-view geometry of the spherical camera model, a 3D model of the surrounding environment is constructed. Using highly robust feature points, the location and pose of the robot are estimated. For system updating, a particle filter combined with a Kalman filter is used because it performs well in complex environments. Numerical simulations and experiments are included to verify the performance of the proposed approach.
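As a rough illustration of the spherical camera model the abstract refers to, a 3D point can be imaged as its direction on the unit sphere centered at the camera, and that direction can be indexed on an equirectangular panorama. The sketch below is illustrative only; the function names and the equirectangular mapping are assumptions, not taken from the paper.

```python
import math

def project_to_sphere(X, Y, Z):
    """Spherical camera model: the image of a 3D point is its
    unit direction vector from the camera center."""
    r = math.sqrt(X * X + Y * Y + Z * Z)
    if r == 0:
        raise ValueError("point coincides with camera center")
    return (X / r, Y / r, Z / r)

def sphere_to_equirect(x, y, z, width, height):
    """Map a unit-sphere direction to pixel coordinates on an
    equirectangular (longitude/latitude) panorama."""
    lon = math.atan2(x, z)                    # longitude in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))   # latitude in [-pi/2, pi/2]
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v
```

For example, a point straight ahead on the optical axis, `(0, 0, 2)`, projects to the direction `(0, 0, 1)` and lands at the center of the panorama.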
