Tensor completion via group-sparse regularization

To enable low-rank tensor completion and factorization, this paper puts forth a novel tensor rank regularization method based on the ℓ1,2-norm of the tensor's parallel factor analysis (PARAFAC) factors. Specifically, for an N-way tensor whose rank-1 component magnitudes are collected in a vector, the proposed regularizer controls the tensor's rank by inducing sparsity in this magnitude vector through ℓ1/N (pseudo)-norm regularization. Our approach thus favors sparser magnitude vectors than existing ℓ2/N- and ℓ1-based alternatives. With an eye towards large-scale tensor mining applications, we also develop efficient and highly scalable solvers for tensor factorization and completion under the proposed criterion. Extensive numerical tests on both synthetic and real data demonstrate that the proposed criterion outperforms competing alternatives in revealing the correct number of components and in estimating the underlying factors.

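As a rough illustration of the criterion described in the abstract (a sketch only; the partially observed tensor X, observation mask Ω, factor matrices A^(n) with columns a_r^(n), rank bound R, and weight μ are notation assumed here for exposition, not taken from the paper), a group-sparse regularized completion problem of this form can be written as

\min_{\{A^{(n)}\}_{n=1}^{N}} \;\; \tfrac{1}{2}\,\big\| \mathcal{P}_{\Omega}\big( \mathcal{X} - [\![ A^{(1)}, \ldots, A^{(N)} ]\!] \big) \big\|_F^2 \; + \; \mu \sum_{r=1}^{R} \sum_{n=1}^{N} \big\| a_r^{(n)} \big\|_2 .

If λ_r := \prod_{n=1}^{N} \| a_r^{(n)} \|_2 denotes the magnitude of the r-th rank-1 component, the arithmetic-geometric mean inequality gives

\sum_{n=1}^{N} \| a_r^{(n)} \|_2 \;\ge\; N \Big( \prod_{n=1}^{N} \| a_r^{(n)} \|_2 \Big)^{1/N} \;=\; N\, \lambda_r^{1/N},

so the column-wise ℓ1,2 penalty effectively acts on the magnitude vector λ = (λ_1, …, λ_R) as an ℓ1/N pseudo-norm. This is the sense in which it promotes sparser magnitude vectors (and hence lower rank) than the ℓ2/N and ℓ1 surrogates mentioned in the abstract.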