Robust local optical flow: Dense motion vector field interpolation

Optical flow methods that integrate sparse point correspondences have made significant contributions to the field of optical flow estimation. In particular, sparse-to-dense interpolation schemes applied to feature point matches have shown outstanding performance when motion must be estimated both accurately and efficiently. Concurrently, local optical flow methods have been significantly improved with respect to long-range motion estimation under varying illumination. This motivates us to propose a sparse-to-dense approach based on the Robust Local Optical Flow (RLOF) method. Compared to state-of-the-art methods, the proposed approach is significantly faster while retaining competitive accuracy on the Middlebury, KITTI 2015 and MPI-Sintel datasets.
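The pipeline the abstract describes, selecting sparse features, estimating their motion with a local method, and interpolating the resulting matches into a dense motion vector field, can be sketched as follows. This is only an illustration under stated assumptions, not the paper's implementation: OpenCV's pyramidal Lucas-Kanade tracker stands in for RLOF, plain linear interpolation stands in for the paper's interpolation scheme, and the helper `sparse_to_dense_flow` is a hypothetical name. Only the OpenCV and SciPy calls themselves are real APIs.

```python
# Minimal sparse-to-dense flow sketch (stand-in for the RLOF-based approach).
import cv2
import numpy as np
from scipy.interpolate import griddata

def sparse_to_dense_flow(img0, img1, max_corners=2000):
    """Estimate a dense flow field from img0 to img1 via sparse tracking + interpolation."""
    g0 = cv2.cvtColor(img0, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)

    # 1) Sparse feature selection (Shi-Tomasi "good features to track").
    pts0 = cv2.goodFeaturesToTrack(g0, maxCorners=max_corners,
                                   qualityLevel=0.01, minDistance=8)

    # 2) Sparse motion estimation; the paper uses RLOF, here pyramidal
    #    Lucas-Kanade serves as a placeholder local method.
    pts1, status, _ = cv2.calcOpticalFlowPyrLK(g0, g1, pts0, None,
                                               winSize=(21, 21), maxLevel=4)
    ok = status.ravel() == 1
    p0 = pts0.reshape(-1, 2)[ok]
    p1 = pts1.reshape(-1, 2)[ok]
    vectors = p1 - p0                      # motion vector at each tracked point

    # 3) Sparse-to-dense interpolation of the motion vector field. Simple
    #    linear interpolation is used here; an edge-aware interpolation in the
    #    spirit of EpicFlow would respect motion boundaries better.
    h, w = g0.shape
    grid_y, grid_x = np.mgrid[0:h, 0:w]
    flow = np.zeros((h, w, 2), np.float32)
    for c in range(2):
        flow[..., c] = griddata(p0, vectors[:, c], (grid_x, grid_y),
                                method='linear', fill_value=0.0)
    return flow
```

The split into the three steps mirrors the sparse-to-dense idea: the cost of the dense estimate is dominated by the number of tracked features rather than by a variational optimization over every pixel, which is where the claimed speed advantage of such schemes comes from.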
