Run Your Visual-Inertial Odometry on NVIDIA Jetson: Benchmark Tests on a Micro Aerial Vehicle

This letter presents benchmark tests of several visual(-inertial) odometry algorithms on NVIDIA Jetson platforms. The compared algorithms cover both monocular and stereo setups as well as Visual Odometry (VO) and Visual-Inertial Odometry (VIO): VINS-Mono, VINS-Fusion, Kimera, ALVIO, Stereo-MSCKF, ORB-SLAM2 stereo, and ROVIO. Because these methods are mainly deployed on unmanned aerial vehicles (UAVs), they must perform well under tight constraints on the size and weight of the processing board. NVIDIA Jetson boards satisfy these constraints while providing a central processing unit (CPU) and graphics processing unit (GPU) powerful enough for image processing. However, existing studies have not compared Jetson boards extensively as processing platforms for VO/VIO in terms of computing-resource usage and accuracy. Therefore, this study compares representative VO/VIO algorithms on several NVIDIA Jetson platforms, namely the Jetson TX2, Xavier NX, and AGX Xavier, and introduces a novel UAV dataset, the ‘KAIST VIO dataset’. The dataset contains several geometric trajectories, including pure rotations, that are harsh for visual(-inertial) state estimation. The evaluation covers the accuracy of the estimated odometry as well as CPU and memory usage across the different Jetson boards, algorithms, and trajectories. We present the results of this comprehensive benchmark and release the dataset for computer vision and robotics applications.
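To illustrate the kind of per-process resource logging such a benchmark depends on, the sketch below polls the CPU and memory usage of a running estimator with the psutil Python package and reports simple statistics. This is a minimal sketch under stated assumptions, not the authors' tooling: the process name (`vins_estimator`), sampling period, and run duration are illustrative placeholders.

```python
"""Minimal sketch of per-process CPU/memory logging for a VO/VIO benchmark.

Assumptions (not from the paper): the estimator runs as its own process
(e.g. a node named 'vins_estimator') and psutil is installed on the Jetson.
"""
import time
import psutil


def find_process(name_substring: str) -> psutil.Process:
    """Return the first running process whose name contains the given substring."""
    for proc in psutil.process_iter(attrs=["name"]):
        if name_substring in (proc.info["name"] or ""):
            return proc
    raise RuntimeError(f"no running process matching '{name_substring}'")


def log_usage(proc: psutil.Process, duration_s: float = 60.0, period_s: float = 0.5):
    """Sample CPU percentage and resident memory of `proc` at a fixed period."""
    cpu_samples, mem_samples = [], []
    proc.cpu_percent(interval=None)                            # prime the CPU counter
    t_end = time.time() + duration_s
    while time.time() < t_end:
        time.sleep(period_s)
        cpu_samples.append(proc.cpu_percent(interval=None))    # % of one core since last call
        mem_samples.append(proc.memory_info().rss / 2**20)     # resident set size, MiB
    return cpu_samples, mem_samples


if __name__ == "__main__":
    estimator = find_process("vins_estimator")                 # hypothetical node name
    cpu, mem = log_usage(estimator, duration_s=30.0)
    print(f"CPU  avg {sum(cpu) / len(cpu):.1f} %   peak {max(cpu):.1f} %")
    print(f"MEM  avg {sum(mem) / len(mem):.1f} MiB  peak {max(mem):.1f} MiB")
```

In a real benchmark the same sampling loop would run alongside trajectory-accuracy evaluation (e.g. absolute trajectory error against motion-capture ground truth), repeated for each board, algorithm, and trajectory.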
