The Event-Camera Dataset: Event-based Data for Pose Estimation, Visual Odometry, and SLAM

New vision sensors, such as the Dynamic and Active-pixel Vision Sensor (DAVIS), incorporate a conventional global-shutter camera and an event-based sensor in the same pixel array. These sensors have great potential for high-speed robotics and computer vision because they allow us to combine the benefits of conventional cameras with those of event-based sensors: low latency, high temporal resolution, and very high dynamic range. However, new algorithms are required to exploit the sensor characteristics and to cope with its unconventional output, which consists of a stream of asynchronous brightness changes (called events) and synchronous grayscale frames. For this purpose, we present and release a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which we hope will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications. In addition to global-shutter intensity images and asynchronous events, we also provide inertial measurements and ground truth from a motion-capture system. All the data are released both as standard text files and binary files (i.e., rosbag). This paper provides an overview of the available data.
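Since the event stream is released as plain text, a minimal reader is easy to sketch. The snippet below is an illustration, not the dataset's official tooling: it assumes each line of a file (here hypothetically named events.txt) holds one event as whitespace-separated fields in the order "timestamp x y polarity"; the actual field order and file name should be verified against the dataset's documentation.

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Event:
    t: float        # timestamp in seconds
    x: int          # pixel column
    y: int          # pixel row
    polarity: bool  # True = brightness increase, False = decrease

def read_events(path: str) -> Iterator[Event]:
    """Yield events from a text file with one event per line:
    't x y p' (assumed field order; check the dataset README)."""
    with open(path) as f:
        for line in f:
            fields = line.split()
            if len(fields) != 4:
                continue  # skip malformed or header lines
            t, x, y, p = fields
            yield Event(float(t), int(x), int(y), p == "1")

# Example usage: count ON/OFF events in the first second of a recording.
if __name__ == "__main__":
    on = off = 0
    for ev in read_events("events.txt"):  # hypothetical file name
        if ev.t > 1.0:
            break
        if ev.polarity:
            on += 1
        else:
            off += 1
    print(f"ON events: {on}, OFF events: {off}")
```

Streaming the file with a generator, as above, keeps memory use constant even though a single recording can contain many millions of events.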
