Ultra-fast tracking based on zero-shift points

A novel tracker based on points where the intensity function is locally even is presented. Tracking these so-called zero-shift points (ZSPs) is highly efficient: a single point is tracked in under 10 microseconds on average on a standard notebook. We demonstrate experimentally the tracker's robustness to image transformations and the relatively long lifetime of ZSPs in real video sequences.
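The abstract characterizes ZSPs as points where the intensity function is locally even, i.e. f(c + d) ≈ f(c − d) for offsets d around a centre c. The paper's actual detector is not reproduced here; the following is only a minimal illustrative sketch under that assumption, scoring each pixel by how far its neighbourhood deviates from point symmetry (function names, the RMS scoring rule, and the top-k selection are all hypothetical choices, not the authors' method):

```python
import numpy as np

def asymmetry_score(image, x, y, r=3):
    """How far the (2r+1)x(2r+1) patch centred at (x, y) is from being
    an even function, i.e. f(c + d) == f(c - d) for all offsets d.
    Lower scores mark better zero-shift-point (ZSP) candidates.
    Illustrative assumption, not the paper's exact detector."""
    patch = image[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    # An even function equals its point reflection about the centre.
    mirrored = patch[::-1, ::-1]
    return float(np.sqrt(np.mean((patch - mirrored) ** 2)))

def detect_zsps(image, r=3, top_k=50):
    """Return the top_k pixel coordinates (x, y) whose neighbourhoods
    are most nearly even (lowest asymmetry score)."""
    h, w = image.shape
    scores = []
    for y in range(r, h - r):
        for x in range(r, w - r):
            scores.append((asymmetry_score(image, x, y, r), x, y))
    scores.sort()
    return [(x, y) for _, x, y in scores[:top_k]]
```

For example, a radially symmetric intensity bump is exactly even about its peak, so the peak scores zero and is returned first; this brute-force scan is only for clarity and would not achieve the sub-10-microsecond tracking times reported in the abstract.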
