Rapid handling of outliers in dense sampling descriptor correspondence fields

Abstract. The sparse-to-dense approach is considered the standard method for capturing the long-range motion of small objects in large-displacement optical flow. Despite progress in the matching and interpolation stages of this approach, little work has focused on improving the handling of outliers after dense sampling descriptor matching. We propose an improved grid-based statistical matching method that quickly removes outliers without computing the backward flow. First, a multigrid statistical matching scheme removes most outliers from the dense sampling descriptor correspondence field. Second, to improve the accuracy of outlier handling, misjudged matches in the edge grid cells are corrected based on the statistical matching constraint. The results of extensive experiments on public optical flow datasets demonstrate the effectiveness of the proposed method.
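
The idea outlined in the abstract (partition the image into grid cells, count the statistical support of each cell-to-cell motion in the forward correspondence field alone, and discard matches whose support is no better than chance) can be sketched as follows. This is a minimal, single-level illustration in the spirit of grid-based motion statistics, not the paper's actual algorithm: the multigrid pass and the edge-grid correction step are omitted, and the function name `grid_statistical_filter`, the cell size, and the threshold factor `tau` are assumptions made for the example.

```python
import numpy as np

def grid_statistical_filter(flow, cell=16, tau=2.0):
    """Flag outliers in a dense descriptor correspondence field with one
    grid-based statistical pass; no backward flow is computed.

    flow : (H, W, 2) float array of per-pixel displacements (dx, dy).
    cell : grid cell size in pixels (assumed value).
    tau  : factor over the support a purely random match would get (assumed).
    Returns a boolean (H, W) mask, True where the match is kept as an inlier.
    """
    H, W = flow.shape[:2]
    gx = (W + cell - 1) // cell            # grid cells per row
    gy = (H + cell - 1) // cell            # grid cells per column
    n_cells = gx * gy

    # Source grid cell of every pixel.
    ys, xs = np.mgrid[0:H, 0:W]
    src_cell = (ys // cell) * gx + (xs // cell)

    # Target grid cell of every pixel after applying its displacement.
    xt = np.clip(xs + flow[..., 0], 0, W - 1).astype(int)
    yt = np.clip(ys + flow[..., 1], 0, H - 1).astype(int)
    dst_cell = (yt // cell) * gx + (xt // cell)

    # Vote matrix: how many matches go from source cell i to target cell j.
    votes = np.zeros((n_cells, n_cells), dtype=np.int64)
    np.add.at(votes, (src_cell.ravel(), dst_cell.ravel()), 1)

    # Keep a match only if its cell-to-cell motion has clearly more support
    # than a random assignment of the cell's pixels would produce.
    expected = (cell * cell) / n_cells
    support = votes[src_cell, dst_cell]
    return support > tau * max(expected, 1.0)
```

In a multigrid variant of this sketch, the same counting would be repeated over coarser and finer cell sizes and the per-cell decision revisited in cells lying on motion boundaries, which is where the paper's edge-grid correction applies.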
