The Blackbird UAV dataset

This article describes the Blackbird unmanned aerial vehicle (UAV) Dataset, a large-scale suite of sensor data and corresponding ground truth from a custom-built quadrotor platform equipped with an inertial measurement unit (IMU), rotor tachometers, and virtual color, grayscale, and depth cameras. Motivated by the increasing demand for agile, autonomous operation of aerial vehicles, this dataset is designed to facilitate the development and evaluation of high-performance UAV perception algorithms. The dataset contains over 10 hours of data from our quadrotor tracing 18 different trajectories at varying maximum speeds (0.5 to 13.8 m/s) through 5 different visual environments, for a total of 176 unique flights. For each flight, we provide 120 Hz grayscale, 60 Hz RGB-D, and 60 Hz semantically segmented images from forward stereo and downward-facing photorealistic virtual cameras, in addition to 100 Hz IMU measurements, ~190 Hz motor speed measurements, and 360 Hz millimeter-accurate motion capture ground truth. The Blackbird UAV Dataset is therefore well suited to the development of algorithms for visual-inertial navigation, 3D reconstruction, and depth estimation. As a benchmark for future algorithms, the performance of two state-of-the-art visual odometry algorithms is reported, and scripts for comparing against the benchmarks are included with the dataset. The dataset is available for download at http://blackbird-dataset.mit.edu/.
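Benchmarking odometry against the 360 Hz motion-capture ground truth typically means computing absolute trajectory error (ATE) after rigid alignment of the estimated trajectory to the ground-truth trajectory. The following NumPy sketch illustrates that standard procedure; it is not the dataset's included comparison scripts, and the function names here are hypothetical.

```python
import numpy as np

def umeyama_align(est, gt):
    """Least-squares rigid alignment (rotation + translation, no scale)
    of estimated positions onto ground truth (Kabsch/Umeyama method).
    est, gt: (N, 3) arrays of time-synchronized positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    # Cross-covariance of centered point sets.
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det(R) = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error after alignment."""
    R, t = umeyama_align(est, gt)
    aligned = est @ R.T + t          # apply R, t to each row
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))
```

In practice the estimated trajectory must first be resampled or interpolated onto the ground-truth timestamps before this pointwise comparison is meaningful.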
