Visual odometry using motion vectors from visual feature points

In recent years, location-based services and indoor positioning systems have gained increasing importance in both research and industry. Visual localization systems have the advantage of not depending on dedicated infrastructure and are therefore especially interesting for navigation within buildings. While approaches exist that use pre-recorded databases of reference images to obtain an absolute position for a given query image, suitable means to estimate the relative movement of pedestrians from an ego-perspective video are still missing. This paper presents a novel visual odometry system for pedestrians. The user carries a mobile device while walking, with the camera pointing in the direction of walking. Using only the video stream as input, the system generates a two-dimensional trajectory that describes the path traveled by the user. Both the user's current heading and the walking direction are estimated from the movement of visual feature points in successive video frames. To assess the accuracy of the system, it is evaluated in three different scenarios (indoors in a university building, in an urban area, and in a city park). Since the system does not rely on reference points (for instance, those provided by a database that associates visual feature points with geo-data), the error accumulates with the distance traveled. After a walked distance of 100 meters, the average error lies between 4.6 and 13.9 meters, depending on the scenario. Consequently, the system is a promising approach to visual odometry that can be used in conjunction with existing absolute visual positioning systems or as a core part of a future SLAM (simultaneous localization and mapping) system.
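As a rough illustration of the idea described above, the following sketch dead-reckons a two-dimensional trajectory from the motion vectors of feature points matched between successive video frames. It is not the authors' implementation: the feature detector (ORB via OpenCV), the fixed per-frame stride STEP_LENGTH_PER_FRAME, and the pixel-to-radian constant PIXELS_PER_RADIAN are assumptions introduced here purely for illustration.

    import cv2
    import numpy as np

    # Hypothetical parameters, not taken from the paper.
    STEP_LENGTH_PER_FRAME = 0.02   # meters advanced per frame (assumed)
    PIXELS_PER_RADIAN = 600.0      # horizontal pixel shift per radian of heading change (assumed)

    def estimate_trajectory(video_path):
        """Dead-reckon a 2D path from feature-point motion between successive frames."""
        cap = cv2.VideoCapture(video_path)
        orb = cv2.ORB_create(nfeatures=500)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

        heading = 0.0            # radians, 0 = initial walking direction
        x, y = 0.0, 0.0          # position in meters
        trajectory = [(x, y)]

        prev_kp, prev_des = None, None
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            kp, des = orb.detectAndCompute(gray, None)
            if prev_des is not None and des is not None and len(kp) > 0:
                matches = matcher.match(prev_des, des)
                if matches:
                    # Median horizontal component of the motion vectors of matched features.
                    dx = np.median([kp[m.trainIdx].pt[0] - prev_kp[m.queryIdx].pt[0]
                                    for m in matches])
                    # Crude assumption: lateral feature motion reflects camera rotation.
                    heading += dx / PIXELS_PER_RADIAN
                    # Advance a fixed stride along the current heading.
                    x += STEP_LENGTH_PER_FRAME * np.cos(heading)
                    y += STEP_LENGTH_PER_FRAME * np.sin(heading)
                    trajectory.append((x, y))
            prev_kp, prev_des = kp, des

        cap.release()
        return trajectory

In practice, mapping feature motion to heading and distance is considerably more involved (outlier rejection, scale estimation, step detection); the sketch only conveys how motion vectors of feature points can drive a simple dead-reckoning loop.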