A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM

Robust, accurate six-degree-of-freedom pose estimation and mapping in real time is a primary need of mobile robots, in particular flying Micro Aerial Vehicles (MAVs), which still perform their impressive maneuvers mostly in controlled environments. This work presents a visual-inertial sensor unit designed for effortless deployment on robots, equipping them with robust real-time Simultaneous Localization and Mapping (SLAM) capabilities and lowering the entry barrier for research on this important topic. Up to four cameras are interfaced through a modern ARM-FPGA system, along with an Inertial Measurement Unit (IMU) providing high-quality rate gyro and accelerometer measurements, calibrated and hardware-synchronized with the images. This enables a tight fusion of visual and inertial cues that achieves a level of robustness and accuracy difficult to reach with purely visual SLAM systems. In addition to raw data, the sensor head provides FPGA-pre-processed data such as visual keypoints, significantly reducing the computational load of SLAM algorithms and enabling deployment on resource-constrained platforms. Sensor selection, hardware and firmware design, as well as intrinsic and extrinsic calibration are addressed in this work. Results from a tightly coupled reference visual-inertial SLAM framework demonstrate the capabilities of the presented system.
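To illustrate one benefit of the hardware synchronization described above, the sketch below shows how a consumer of such a sensor stream might bucket IMU samples between consecutive, hardware-timestamped camera frames, which is the grouping a tightly coupled visual-inertial estimator needs for each image interval. This is a hypothetical example, not the authors' firmware or driver API; all type and function names are illustrative.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Hypothetical records; a real driver would expose similar timestamped data.
struct ImuSample {
    uint64_t t_ns;             // hardware timestamp (nanoseconds)
    double gyro[3];            // rate gyro [rad/s]
    double accel[3];           // accelerometer [m/s^2]
};

struct Frame {
    uint64_t t_ns;             // mid-exposure timestamp from the same clock
    int frame_id;
};

// Collect the IMU samples falling between two consecutive frames. Because
// both streams share one hardware clock, a plain timestamp comparison
// suffices; no software clock-offset estimation is required.
std::vector<ImuSample> imuBetween(const std::vector<ImuSample>& imu,
                                  uint64_t t_prev_ns, uint64_t t_curr_ns) {
    std::vector<ImuSample> out;
    for (const ImuSample& s : imu)
        if (s.t_ns > t_prev_ns && s.t_ns <= t_curr_ns)
            out.push_back(s);
    return out;
}

int main() {
    // Toy data: 200 Hz IMU, two camera frames 50 ms apart, one shared clock.
    std::vector<ImuSample> imu;
    for (int i = 0; i < 100; ++i)
        imu.push_back({uint64_t(i) * 5000000ULL,
                       {0.0, 0.0, 0.1}, {0.0, 0.0, 9.81}});

    Frame prev{0, 0};
    Frame curr{50000000ULL, 1};
    std::vector<ImuSample> chunk = imuBetween(imu, prev.t_ns, curr.t_ns);
    std::cout << "IMU samples between frames " << prev.frame_id << " and "
              << curr.frame_id << ": " << chunk.size() << "\n";
    return 0;
}
```

With unsynchronized sensors, the same grouping would additionally require estimating and compensating a time offset between the camera and IMU clocks, e.g. via the temporal calibration methods the paper builds on.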
