OpenRatSLAM: an open source brain-based SLAM system

RatSLAM is a navigation system based on the neural processes underlying navigation in the rodent brain, and is capable of operating with low-resolution monocular image data. Seminal experiments using RatSLAM include mapping an entire suburb with a web camera and a long-term robot delivery trial. This paper describes OpenRatSLAM, an open-source version of RatSLAM with bindings to the Robot Operating System (ROS) framework to leverage advantages such as robot and sensor abstraction, networking, data playback, and visualization. OpenRatSLAM comprises connected ROS nodes representing RatSLAM's pose cells, experience map, and local view cells, as well as a fourth node that provides visual odometry estimates. The nodes are described with reference to the RatSLAM model, along with salient details of the ROS implementation such as topics, messages, parameters, class diagrams, sequence diagrams, and parameter tuning strategies. The performance of the system is demonstrated on three publicly available datasets.
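The data flow implied by the four-node architecture can be illustrated with a minimal, self-contained sketch. This is not the OpenRatSLAM API: the class names, thresholds, and the toy scene signatures below are illustrative assumptions, and the real nodes exchange ROS messages over topics rather than calling each other directly. The sketch only shows how visual odometry and local view cells feed the pose cells, which in turn drive the experience map and loop closure.

```python
import math

class LocalViewCells:
    """Stores low-resolution scene templates; reports the best match."""
    def __init__(self, threshold=0.1):
        self.templates = []
        self.threshold = threshold

    def update(self, scene):
        # scene: tuple of floats standing in for a low-res image signature
        best_id, best_err = None, float("inf")
        for i, t in enumerate(self.templates):
            err = sum(abs(a - b) for a, b in zip(scene, t)) / len(scene)
            if err < best_err:
                best_id, best_err = i, err
        if best_id is not None and best_err < self.threshold:
            return best_id, False              # familiar view
        self.templates.append(scene)
        return len(self.templates) - 1, True   # new view cell

class PoseCells:
    """Crude stand-in for the attractor network: integrates odometry and
    snaps back to a stored pose when a familiar view is seen."""
    def __init__(self):
        self.pose = (0.0, 0.0, 0.0)            # x, y, theta
        self.view_poses = {}

    def update(self, odom, view_id, is_new):
        x, y, th = self.pose
        dtrans, drot = odom                    # from the visual odometry node
        th += drot
        self.pose = (x + dtrans * math.cos(th),
                     y + dtrans * math.sin(th), th)
        if is_new:
            self.view_poses[view_id] = self.pose
        else:
            self.pose = self.view_poses[view_id]   # relocalise on a match
        return self.pose

class ExperienceMap:
    """Graph of experiences; revisiting a known view closes a loop."""
    def __init__(self):
        self.experiences = []                  # (pose, view_id)
        self.links = []                        # (from_idx, to_idx)

    def update(self, pose, view_id, is_new):
        if is_new or not self.experiences:
            self.experiences.append((pose, view_id))
            if len(self.experiences) > 1:
                self.links.append((len(self.experiences) - 2,
                                   len(self.experiences) - 1))
        else:
            # loop closure: link the latest experience back to the one
            # that shares this view
            for i, (_, vid) in enumerate(self.experiences):
                if vid == view_id:
                    self.links.append((len(self.experiences) - 1, i))
                    break

# Drive the pipeline with fabricated odometry and scenes; the third
# scene repeats the first, triggering a loop closure.
lv, pc, em = LocalViewCells(), PoseCells(), ExperienceMap()
scenes = [(0.0, 1.0), (0.5, 0.5), (0.0, 1.0)]
odometry = [(1.0, 0.0), (1.0, 0.0), (1.0, 0.0)]
for scene, odom in zip(scenes, odometry):
    view_id, is_new = lv.update(scene)
    pose = pc.update(odom, view_id, is_new)
    em.update(pose, view_id, is_new)
```

In the real system each `update` corresponds to a message published on a ROS topic, which is what gives OpenRatSLAM its robot abstraction, playback, and visualization support.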
