Low-Rank Data Completion With Very Low Sampling Rate Using Newton's Method

Newton's method is a widely applicable and empirically efficient method for solving systems of equations. Recently developed algebraic-geometric analyses provide information-theoretic bounds on the sampling rate that ensure the existence of a unique completion. An open question left by these works is how to actually retrieve the sampled data when the sampling rate is very close to these information-theoretic bounds. This paper addresses that question: we propose a new approach for recovering a partially sampled low-rank matrix or tensor when the number of samples is only slightly more than the dimension of the corresponding manifold, by solving a set of polynomial equations using Newton's method. In particular, we consider low-rank matrix completion, matrix sensing, and tensor completion. Each observed entry contributes one polynomial equation in the factors of the rank factorization of the data. By exploiting the specific structure of the resulting set of polynomial equations, we analytically characterize the convergence regions of Newton's method for matrix completion and matrix sensing. Through extensive numerical results, we show that the proposed approach outperforms well-known methods such as nuclear norm minimization and alternating minimization in terms of the success rate of data recovery (noiseless case) and peak signal-to-noise ratio (noisy case), especially when the sampling rate is very low.
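The following is a minimal sketch, not the authors' implementation, of the matrix-completion idea described above: write the unknown rank-r matrix as M = U V^T, let each observed entry (i, j) contribute the polynomial equation <U[i, :], V[j, :]> = M[i, j], and apply Newton iterations to the stacked factor variables. The least-squares Newton step, the perturbed initialization, and the stopping rule are illustrative assumptions; the paper's contribution is the analytical characterization of the region around a solution from which such iterations converge.

```python
import numpy as np

def newton_complete(m, n, r, obs, n_iter=100, tol=1e-10, U0=None, V0=None, seed=0):
    """Find U (m x r) and V (n x r) such that (U V^T)_{ij} matches the
    observed entries in obs, a list of (i, j, value) triples."""
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, r)) if U0 is None else U0.copy()
    V = rng.standard_normal((n, r)) if V0 is None else V0.copy()
    for _ in range(n_iter):
        # Residuals: one polynomial equation per observed entry.
        F = np.array([U[i] @ V[j] - val for i, j, val in obs])
        if np.linalg.norm(F) < tol:
            break
        # Jacobian with respect to the stacked unknowns x = [vec(U); vec(V)].
        J = np.zeros((len(obs), (m + n) * r))
        for k, (i, j, _) in enumerate(obs):
            J[k, i * r:(i + 1) * r] = V[j]                   # d f_k / d U[i, :]
            J[k, m * r + j * r:m * r + (j + 1) * r] = U[i]   # d f_k / d V[j, :]
        # Newton step taken in the least-squares sense, since the number of
        # equations need not equal the number of unknowns.
        step, *_ = np.linalg.lstsq(J, F, rcond=None)
        U -= step[:m * r].reshape(m, r)
        V -= step[m * r:].reshape(n, r)
    return U, V

if __name__ == "__main__":
    # Synthetic rank-1 example: M = u v^T with only a few observed entries.
    u = np.arange(1.0, 6.0)[:, None]
    v = np.arange(1.0, 6.0)[:, None]
    M = u @ v.T
    idx = [(0, 0), (0, 4), (1, 2), (2, 1), (2, 4), (3, 1), (3, 3), (4, 0), (4, 2), (4, 4)]
    obs = [(i, j, M[i, j]) for i, j in idx]
    rng = np.random.default_rng(1)
    # Initialize near the true factors; convergence from an arbitrary start
    # is not guaranteed, which is why convergence regions matter.
    U0 = u + 0.1 * rng.standard_normal(u.shape)
    V0 = v + 0.1 * rng.standard_normal(v.shape)
    U, V = newton_complete(5, 5, 1, obs, U0=U0, V0=V0)
    print("max error on observed entries:",
          max(abs(float(U[i] @ V[j]) - val) for i, j, val in obs))
```

Because the rank factorization is only identifiable up to an invertible gauge transformation, the Jacobian above is rank deficient; solving the linearized system in the least-squares sense is one simple way to handle this in a sketch, though the paper's structured analysis may proceed differently.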
