Live demonstration: Neuromorphic event-based multi-kernel algorithm for high speed visual features tracking

This demo presents a method for visual tracking using the output of an asynchronous neuromorphic event-based camera. The approach is itself event-based and is thus adapted to the scene-driven nature of these sensors. The method tracks multiple visual features in real time at an equivalent update frequency of several hundred kilohertz. It adapts to scene content, combining spatial and temporal correlations of events in an asynchronous iterative framework. Several kernels can be used to track features from incoming events, such as Gaussians, Gabor functions, combinations of Gabor functions, or any hand-crafted kernel, under very weak constraints. The proposed feature-tracking method handles variations in feature position, scale, and orientation. Tracking performance is evaluated experimentally for each kernel to demonstrate the robustness of the proposed solution.
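The core idea of event-driven kernel tracking can be illustrated with a minimal sketch: each incoming event nudges a feature's position estimate by a step weighted by a kernel (here a Gaussian) centred on the current estimate, so nearby events pull the tracker strongly while distant, likely unrelated events barely affect it. The class name, parameters, and update rule below are illustrative assumptions, not the authors' implementation.

```python
import math

class GaussianKernelTracker:
    """Toy event-driven tracker: one incremental update per event,
    weighted by a Gaussian kernel around the current estimate.
    (Hypothetical sketch, not the demo's actual algorithm.)"""

    def __init__(self, x0, y0, sigma=5.0, lr=0.1):
        self.x, self.y = float(x0), float(y0)
        self.sigma = sigma  # kernel width in pixels (assumed)
        self.lr = lr        # per-event learning rate (assumed)

    def update(self, ex, ey):
        dx, dy = ex - self.x, ey - self.y
        # Gaussian weight: distant events contribute almost nothing,
        # which gives the tracker robustness to background activity.
        w = math.exp(-(dx * dx + dy * dy) / (2.0 * self.sigma ** 2))
        self.x += self.lr * w * dx
        self.y += self.lr * w * dy
        return self.x, self.y

# Feed synthetic events clustered around (12, 8); the estimate,
# initialised at (10, 10), drifts toward the event cluster.
tracker = GaussianKernelTracker(10.0, 10.0)
for ex, ey in [(12, 8), (13, 7), (11, 9), (12, 8)] * 20:
    tracker.update(ex, ey)
print(tracker.x, tracker.y)
```

Because each event triggers only a constant-time update, a scheme of this kind can run at the per-event rates the abstract describes; other kernels (e.g. Gabor functions) would replace the Gaussian weight to add orientation selectivity.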
