Event-based feature tracking with probabilistic data association

Asynchronous event-based sensors present new challenges in basic robot vision problems such as feature tracking. The few existing approaches rely on grouping events into models and computing optical flow after assigning future events to those models. Such a hard commitment in data association degrades the optical flow quality and leads to shorter flow tracks. In this paper, we introduce a novel soft data association modeled with probabilities. The association probabilities are computed in an EM scheme intertwined with the optical flow computation, which maximizes the expectation (marginalization) over all associations. In addition, to enable longer tracks, we compute the affine deformation with respect to the initial point and use the resulting residual as a measure of persistence. The computed optical flow enables a temporal integration window that differs for every feature and is sized inversely proportional to the length of the flow. We show results on egomotion and very fast vehicle sequences, and we demonstrate superiority over standard frame-based cameras.
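The core idea of the abstract can be illustrated with a minimal sketch of an EM loop that alternates between soft event-to-model association probabilities (E-step) and an optical flow update that maximizes the expectation over all associations (M-step). This is a hypothetical simplification, not the paper's implementation: it assumes a purely translational flow, a Gaussian association kernel, and made-up names (`em_flow`, `sigma`); the paper's method additionally handles affine deformation and per-feature integration windows.

```python
import numpy as np

def em_flow(template, events, dt, n_iters=20, sigma=1.0):
    """Estimate a 2D translational flow v warping events onto template
    points via soft (probabilistic) data association.

    template : (M, 2) model point locations at the reference time
    events   : (N, 2) event locations
    dt       : (N,)   time of each event after the reference time
    """
    v = np.zeros(2)
    for _ in range(n_iters):
        # Warp events back to the reference time with the current flow.
        warped = events - dt[:, None] * v                      # (N, 2)
        # E-step: association probability r[i, j] of event i with
        # template point j, from a Gaussian kernel on distance.
        d2 = ((warped[:, None, :] - template[None, :, :]) ** 2).sum(-1)
        r = np.exp(-d2 / (2.0 * sigma ** 2))
        r /= r.sum(axis=1, keepdims=True) + 1e-12
        # M-step: closed-form weighted least-squares flow update,
        # i.e. the v maximizing the expected log-likelihood.
        resid = events[:, None, :] - template[None, :, :]      # (N, M, 2)
        num = (r[:, :, None] * resid * dt[:, None, None]).sum(axis=(0, 1))
        den = (r * dt[:, None] ** 2).sum()
        v = num / den
    return v
```

Because no event is ever hard-assigned to a single model point, an initially ambiguous association is revised in later iterations as the flow estimate improves, which is the motivation for soft association stated in the abstract.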
