Scaled Monocular SLAM for Walking People

In this paper we present a full-scaled, real-time monocular SLAM system that uses only a wearable camera. Assuming that the person is walking, the head's oscillatory motion perceived in the initial visual odometry estimate allows the computation of a dynamic scale factor over static windows of N camera poses. Building on this method, we introduce a consistency test to detect non-walking situations and propose a sliding-window approach that reduces the delay in the update of the scaled trajectory. We evaluate our approach experimentally on an unscaled visual odometry estimate obtained with a wearable camera along a path of 886 m. The results show a significant improvement with respect to the initial unscaled estimate, with a mean relative error of 0.91% over the total trajectory length.
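As a rough illustration of the idea described above, and not the paper's actual algorithm, the following Python sketch estimates a per-window scale factor from the vertical oscillation of an unscaled camera trajectory: steps are detected as peaks of the head's bobbing motion, the implied metric distance is compared with the unscaled path length, and a simple cadence check stands in for the consistency test. The step length, cadence band, vertical-axis convention, and the form of the consistency test are all assumptions made for this example.

```python
# Minimal sketch of scale recovery from walking-induced head oscillation.
# All numeric values (step length, cadence bounds) and the consistency-test
# form are illustrative assumptions, not the paper's parameters.
import numpy as np
from scipy.signal import find_peaks

ASSUMED_STEP_LENGTH_M = 0.75      # assumed average step length of the walker
WALKING_CADENCE_HZ = (1.2, 2.5)   # assumed plausible step-frequency range


def scale_from_window(positions, timestamps):
    """Estimate a scale factor for one window of unscaled camera poses.

    positions  : (N, 3) array of unscaled camera centres from visual odometry
    timestamps : (N,) array of capture times in seconds
    Returns (scale, is_walking); scale is None when the consistency test
    rejects the window as a non-walking segment.
    """
    # Head bobbing appears as an oscillation of the vertical coordinate
    # (assumed here to be the second component of each camera centre).
    vertical = positions[:, 1] - positions[:, 1].mean()

    # One peak per step; require some prominence to ignore noise.
    peaks, _ = find_peaks(vertical, prominence=0.25 * vertical.std())
    if len(peaks) < 2:
        return None, False

    span = timestamps[peaks[-1]] - timestamps[peaks[0]]
    if span <= 0:
        return None, False

    # Consistency test (assumed form): the step frequency must lie in a
    # plausible walking-cadence band, otherwise treat as non-walking.
    step_freq = (len(peaks) - 1) / span
    if not (WALKING_CADENCE_HZ[0] <= step_freq <= WALKING_CADENCE_HZ[1]):
        return None, False

    # Metric distance implied by the step count vs. the unscaled path length.
    metric_distance = len(peaks) * ASSUMED_STEP_LENGTH_M
    unscaled_distance = np.linalg.norm(np.diff(positions, axis=0), axis=1).sum()
    if unscaled_distance < 1e-6:
        return None, False

    return metric_distance / unscaled_distance, True
```

In a sliding-window variant of this sketch, the function would be re-evaluated each time a new pose arrives, over the most recent N poses, so the scaled trajectory can be updated with less delay than waiting for a full static window to complete.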
