A Commute in Data: The comma2k19 Dataset

comma.ai presents comma2k19, a dataset of over 33 hours of commute on California's Highway 280. It comprises 2019 segments, each one minute long, recorded on a 20 km section of highway between San Jose and San Francisco. The dataset was collected with comma EONs, which carry sensors similar to those of any modern smartphone: a road-facing camera, phone GPS, thermometers, and a 9-axis IMU. In addition, the EON captures raw GNSS measurements and all CAN data sent by the car through a comma grey panda. Laika, an open-source GNSS processing library, is also introduced here; it produces positions 40% more accurate than those reported by the GNSS module used to collect the raw data. The dataset includes pose (position and orientation) estimates of the recording camera in a global reference frame. These poses were computed with a tightly coupled INS/GNSS/vision optimizer that relies on data processed by Laika. comma2k19 is well suited for developing and validating tightly coupled GNSS algorithms and mapping algorithms that work with commodity sensors.
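To make the pose data concrete: assuming the global reference frame is Earth-centered Earth-fixed (ECEF), as is typical for GNSS-derived positions (the abstract itself only says "global reference frame"), the sketch below shows one standard way a position from such a frame could be converted to geodetic latitude, longitude, and height for mapping. The iterative WGS84 conversion is a generic textbook method, not code from comma2k19 or Laika.

```python
import numpy as np

# WGS84 ellipsoid constants
A = 6378137.0                 # semi-major axis (m)
E2 = 6.6943799901413165e-3    # first eccentricity squared

def ecef_to_geodetic(x, y, z, iters=10):
    """Convert an ECEF position (meters) to geodetic
    (latitude deg, longitude deg, height m) by fixed-point iteration."""
    lon = np.arctan2(y, x)
    p = np.hypot(x, y)                       # distance from Earth's axis
    lat = np.arctan2(z, p * (1.0 - E2))      # initial latitude guess
    h = 0.0
    for _ in range(iters):
        n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)  # prime vertical radius
        h = p / np.cos(lat) - n
        lat = np.arctan2(z, p * (1.0 - E2 * n / (n + h)))
    return np.degrees(lat), np.degrees(lon), h

# Example: a point 100 m above the ellipsoid at lat 0, lon 0
lat, lon, h = ecef_to_geodetic(A + 100.0, 0.0, 0.0)
```

In practice one would feed the dataset's per-frame global positions through a routine like this (or the equivalent helpers in a geodesy library) before plotting trajectories on a map.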
