Polynomial Matrix Completion for Missing Data Imputation and Transductive Learning

This paper develops new methods to recover the missing entries of a high-rank or even full-rank matrix when the intrinsic dimension of the data is low compared to the ambient dimension. Specifically, we assume that the columns of the matrix are generated by polynomials acting on a low-dimensional intrinsic variable, and aim to recover the missing entries under this assumption. We show that the complete matrix of minimum intrinsic dimension can be identified by minimizing the rank of the matrix in a high-dimensional feature space. We develop a new formulation of the resulting problem using the kernel trick, together with a new relaxation of the rank objective, and propose an efficient optimization method. We also show how to use our methods to complete data drawn from multiple nonlinear manifolds. Comparative studies on synthetic data, subspace clustering with missing data, motion capture data recovery, and transductive learning demonstrate that our methods outperform the state of the art.
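The shape of the computation is easy to prototype. Below is a minimal sketch, in JAX, of the idea the abstract describes: lift the columns of the matrix implicitly through a polynomial kernel, replace the rank of the feature-space matrix with a smooth surrogate, and run gradient descent on the unobserved entries only. Everything here is an assumption for illustration, not the paper's algorithm: the function names are invented, the log-det surrogate is a classical rank-minimization heuristic standing in for the paper's relaxation, and the optimizer is plain gradient descent.

```python
# Minimal sketch of kernelized rank minimization for matrix completion.
# Assumptions (not from the paper): a log-det rank surrogate and plain
# gradient descent; the paper's relaxation and optimizer differ.
import jax
import jax.numpy as jnp

def poly_kernel(X, c=1.0, d=2):
    # K[i, j] = (x_i . x_j + c)^d over the columns of X. K has the same
    # nonzero eigenvalues as the Gram matrix of the degree-d feature map,
    # so a rank penalty on K acts on the matrix in feature space.
    return (X.T @ X + c) ** d

def logdet_surrogate(X, c=1.0, d=2, delta=1e-3):
    # Smooth rank surrogate: log det(K + delta*I) = sum_i log(lam_i + delta).
    K = poly_kernel(X, c, d)
    _, logdet = jnp.linalg.slogdet(K + delta * jnp.eye(K.shape[0]))
    return logdet

def complete(X_obs, mask, steps=2000, lr=1e-2):
    # mask[i, j] is True where the entry is observed. Observed entries
    # are held fixed; gradient steps touch only the missing ones.
    X = jnp.where(mask, X_obs, 0.0)              # zero-fill initialization
    grad_fn = jax.jit(jax.grad(logdet_surrogate))
    for _ in range(steps):
        X = X - lr * grad_fn(X) * (~mask)
    return X

# Toy usage: columns are quadratic images of a 1-D latent variable, so
# the 3 x n matrix is full-rank but has intrinsic dimension 1.
key = jax.random.PRNGKey(0)
t = jax.random.uniform(key, (1, 50))
X_true = jnp.vstack([t, t**2, 1.0 + t - t**2])
mask = jax.random.uniform(jax.random.PRNGKey(1), X_true.shape) < 0.7
X_hat = complete(jnp.where(mask, X_true, 0.0), mask)
print(jnp.linalg.norm((X_hat - X_true) * ~mask))  # error on missing entries
```

Note the design choice the sketch mirrors: the kernel trick keeps the cost tied to the number of columns rather than the size of the explicit polynomial feature space, which grows combinatorially with the degree. The paper's actual formulation works entirely through the kernel matrix with a sharper rank relaxation; this sketch only illustrates the structure of the problem.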
