Learning a Joint Low-Rank and Gaussian Model in Matrix Completion with Spectral Regularization and Expectation Maximization Algorithm

Completing a partially known matrix is an important problem in data science, with many applications such as collaborative filtering for recommendation systems and global positioning in large-scale sensor networks. Low-rank and Gaussian models are two popular classes of models used in matrix completion, both with proven success. In this paper, we introduce a single model that leverages the features of both low-rank and Gaussian models. We develop a novel method based on Expectation Maximization (EM) that combines spectral regularization (for the low-rank part) with maximum likelihood estimation (for learning the Gaussian parameters). We also test our framework on real-world movie rating data and compare it against several common matrix completion methods.
