Retinal Slip Estimation and Object Tracking with an Active Event Camera

This paper presents a retinal slip estimation algorithm and a novel tracking strategy for active event-based cameras. Event-based sensors have independent, asynchronous pixels that report local luminance changes. Unlike conventional image sensors, event cameras offer high dynamic range and high temporal resolution. To preserve these advantages, we propose an algorithm that updates estimates of the normal component of optical flow as each event arrives, eschewing the accumulation of events within a fixed temporal window. We also propose a method for integrating these normal flow estimates into an estimate of the retinal slip between the camera and the tracked object. We demonstrate the advantages of this asynchronous retinal slip estimator using an active event camera simulator (AESIM), which we developed to enable simulation of the active control of an event sensor in a dynamic environment.
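The abstract does not give the estimator's details, but per-event normal flow is commonly computed by fitting a local plane to the surface of active events (SAE), i.e. the map of most recent event timestamps per pixel. The sketch below illustrates that general idea under stated assumptions; the function name, window size, and plane-fitting choice are illustrative and not taken from the paper.

```python
import numpy as np

def update_normal_flow(sae, x, y, t, half_win=2, min_pts=5):
    """Illustrative per-event normal-flow update via local plane fitting
    on the surface of active events (SAE). `sae` is an HxW array of the
    latest event timestamp at each pixel (NaN where no event has fired).
    Returns the normal flow vector in pixels per time unit, or None if
    the local fit is not well constrained. This is a generic sketch, not
    the paper's algorithm."""
    H, W = sae.shape
    sae[y, x] = t  # asynchronous update: only the firing pixel changes

    # Gather timestamps in a small spatial neighborhood (clipped to bounds).
    y0, y1 = max(0, y - half_win), min(H, y + half_win + 1)
    x0, x1 = max(0, x - half_win), min(W, x + half_win + 1)
    ys, xs = np.mgrid[y0:y1, x0:x1]
    ts = sae[ys, xs]
    mask = ~np.isnan(ts)
    if mask.sum() < min_pts:
        return None

    # Fit the plane t = a*x + b*y + c by least squares;
    # (a, b) approximates the spatial gradient of the timestamp surface.
    A = np.stack([xs[mask], ys[mask], np.ones(mask.sum())], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, ts[mask], rcond=None)
    a, b = coeffs[0], coeffs[1]
    g2 = a * a + b * b
    if g2 < 1e-12:
        return None

    # Normal flow points along the SAE gradient with speed 1/|grad t|.
    return np.array([a, b]) / g2
```

Because each call touches only one pixel of the SAE before refitting a small neighborhood, the estimate can be refreshed event by event rather than over a fixed temporal window, which is the property the abstract emphasizes.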
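A single normal-flow measurement only constrains the component of motion along the local gradient (the aperture problem), so recovering a full retinal-slip vector requires combining several measurements. One standard way to do this, shown below as a hedged sketch rather than the paper's method, is to solve the stacked constraints v·n_i = |f_i| in the least-squares sense; the function name and thresholds are assumptions for illustration.

```python
import numpy as np

def estimate_slip(normal_flows, eps=1e-9):
    """Illustrative recovery of a full 2-D flow (retinal slip) vector
    from a set of normal-flow vectors. Each normal flow f_i constrains
    the full flow v by v . n_i = |f_i|, with n_i = f_i / |f_i|; the
    system is solved by least squares. Returns None if fewer than two
    usable constraints are available (the system is then degenerate)."""
    N, s = [], []
    for f in normal_flows:
        mag = np.hypot(f[0], f[1])
        if mag < eps:
            continue  # skip near-zero measurements: no usable direction
        N.append(np.asarray(f, dtype=float) / mag)  # unit constraint direction
        s.append(mag)                               # speed along that direction
    if len(N) < 2:
        return None
    v, *_ = np.linalg.lstsq(np.array(N), np.array(s), rcond=None)
    return v
```

In a tracking loop, the constraint set would be drawn from recent per-event normal-flow estimates on the target, so the slip estimate can also be refreshed asynchronously as new events arrive.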
