Atom Decomposition with Adaptive Basis Selection Strategy for Matrix Completion

Estimating the missing entries of a matrix has attracted much attention owing to its wide range of applications, such as image inpainting and video denoising, which are usually formulated as low-rank matrix completion problems. It is common to use the nuclear norm as a surrogate for the rank operator, since it is the tightest convex lower bound of the rank under certain conditions. However, most approaches based on nuclear norm minimization require repeated singular value decomposition (SVD) operations. For a matrix X ∈ R^{m×n} (m ≥ n), a full SVD costs O(mn²) time, which imposes a prohibitive computational burden on large-scale matrices and limits the use of these methods in real applications. Motivated by this observation, a series of atom-decomposition-based matrix completion methods have been studied. The key idea is to reconstruct the target matrix greedily with pursuit methods, which require only the top singular vector pair (a rank-one SVD) per iteration and therefore enjoy great efficiency advantages over SVD-based matrix completion methods. However, because approximation errors accumulate over the greedy iterations, atom-decomposition-based methods usually yield unsatisfactory reconstruction accuracy. In this article, we propose a new efficient and scalable atom decomposition algorithm for matrix completion, called the Adaptive Basis Selection Strategy (ABSS). Unlike traditional greedy atom decomposition methods, ABSS generates the basis in two phases, each using a different strategy suited to the different nature of the bases it produces. First, we globally prune the basis space to eliminate unimportant bases as far as possible and to locate the subspace likely to contain the most informative ones. Then, a further group of bases is learned from local information to improve the recovery accuracy.
In this way, the proposed algorithm breaks through the accuracy bottleneck of traditional atom-decomposition-based matrix completion methods while retaining their innate efficiency advantage over SVD-based methods. We empirically evaluate ABSS on real visual image data and on large-scale recommendation datasets. The results show that ABSS achieves much better reconstruction accuracy at a cost comparable to that of atom-decomposition-based methods, and that it matches or exceeds the reconstruction accuracy of state-of-the-art SVD-based matrix completion algorithms with enormous efficiency advantages.
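To make the atom-decomposition family concrete, the following is a minimal sketch of a generic greedy rank-one pursuit for matrix completion: at each iteration it extracts the top singular vector pair of the residual on the observed entries via power iteration (the "top SVD" mentioned above), and adds the resulting rank-one atom with a closed-form step size. This is an illustration of the baseline technique that ABSS improves upon, not of ABSS itself; the function names and default parameters are our own choices.

```python
import numpy as np

def top_singular_pair(R, iters=50):
    # Power iteration for the leading left/right singular vectors of R.
    m, n = R.shape
    v = np.random.default_rng(0).standard_normal(n)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        u = R @ v
        u /= np.linalg.norm(u)
        v = R.T @ u
        v /= np.linalg.norm(v)
    return u, v

def rank_one_pursuit(X_obs, mask, k=10):
    """Greedy rank-one matrix pursuit (illustrative sketch).

    X_obs: observed matrix with zeros at unobserved positions.
    mask:  boolean array marking observed entries.
    Returns a rank-<=k estimate of the full matrix.
    """
    X_hat = np.zeros_like(X_obs, dtype=float)
    for _ in range(k):
        R = (X_obs - X_hat) * mask          # residual on observed entries
        u, v = top_singular_pair(R)
        A = np.outer(u, v)                  # new rank-one atom
        # Closed-form step size: least-squares fit on observed entries.
        num = np.sum(R * A)
        den = np.sum((A * mask) ** 2) + 1e-12
        X_hat = X_hat + (num / den) * A
    return X_hat
```

Each iteration costs only a few sparse matrix-vector products, which is the efficiency advantage cited above; the accuracy limitation arises because atoms chosen in early iterations are never revisited, so their errors accumulate.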
