Towards Visual SLAM with Event-based Cameras

Event-based cameras (Figure 1) offer great potential to the fields of robotics and computer vision, in part due to their large dynamic range and extremely high “frame rates”. These attributes make them, at least in theory, particularly well suited to enabling tasks like navigation and mapping on high-speed, agile robotic platforms under challenging lighting conditions – tasks that have proven particularly difficult for traditional algorithms and camera sensors. Before these tasks become feasible, however, progress must be made towards adapting and extending current RGB-camera-based algorithms to work with event-based cameras. In this paper we present ongoing work towards this goal and an initial milestone – the development of a constrained visual SLAM system that creates semi-metric, topologically correct maps of a 2.7 km traverse through a large environment at real-time speed (Figure 2). Although much more sophistication has yet to be built into the system, we hope this work serves as a baseline for future research using these novel sensors.