ESVIO: Event-Based Stereo Visual-Inertial Odometry

Event cameras are emerging bio-inspired sensors that output pixel-level brightness changes at extremely high rates, and event-based visual-inertial odometry (VIO) is increasingly studied and used in autonomous robots. In this paper, we propose an event-based stereo VIO system, named ESVIO. First, we present a novel direct event-based VIO method that fuses event depth, Time-Surface images, and pre-integrated inertial measurements to estimate the camera motion and inertial measurement unit (IMU) biases in a sliding-window non-linear optimization framework, effectively improving state estimation accuracy and robustness. Second, we design an event-inertial semi-joint initialization method that rapidly and accurately solves for the initialization parameters of the VIO system through two steps, event-only initialization followed by event-inertial initial optimization, further improving state estimation accuracy. Based on these two methods, we implement the ESVIO system and evaluate its effectiveness and robustness on various public datasets. The experimental results show that ESVIO achieves good accuracy and robustness compared with other state-of-the-art event-based VIO and stereo visual odometry (VO) systems, without compromising real-time performance.
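
The Time-Surface images mentioned above are a common event representation: each pixel stores the timestamp of its most recent event, and the image value decays exponentially with the time elapsed since that event. The snippet below is a minimal sketch of this idea, not the paper's implementation; the event tuple layout, the decay constant `tau`, and the function name are illustrative assumptions.

```python
import numpy as np

def time_surface(events, t_ref, height, width, tau=0.03):
    """Build a Time-Surface image from an event stream.

    events : iterable of (x, y, t, polarity) tuples, timestamps in seconds
    t_ref  : reference time at which the surface is evaluated
    tau    : exponential decay constant (illustrative value)
    """
    # Timestamp of the most recent event at each pixel (-inf if none yet).
    last_t = np.full((height, width), -np.inf)
    for x, y, t, _ in events:
        if t <= t_ref and t > last_t[y, x]:
            last_t[y, x] = t

    # Exponential decay of the elapsed time since the last event: pixels with
    # recent events are bright, stale or untouched pixels fade toward zero.
    ts = np.exp(-(t_ref - last_t) / tau)
    ts[np.isinf(last_t)] = 0.0
    return ts

# Example: a few synthetic events on a small sensor.
evts = [(5, 5, 0.010, 1), (6, 5, 0.020, -1), (5, 6, 0.025, 1)]
img = time_surface(evts, t_ref=0.030, height=32, width=32)
```

Because such an image is dense and smoothly varying near recent events, it lends itself to the direct (photometric-style) alignment and sliding-window optimization described in the abstract.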
