Square Deal: Lower Bounds and Improved Convex Relaxations for Tensor Recovery

Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing and machine learning. The most popular convex relaxation of this problem minimizes the sum of the nuclear norms (SNN) of the unfolding matrices of the tensor. We show that this approach can be substantially suboptimal: reliably recovering a K-way n × n × ··· × n tensor of Tucker rank (r, r, ..., r) from Gaussian measurements requires Ω(rn^{K-1}) observations. In contrast, a certain (intractable) nonconvex formulation needs only O(r^K + nrK) observations. We introduce a simple, new convex relaxation, which partially bridges this gap. Our new formulation succeeds with O(r^{⌊K/2⌋} n^{⌈K/2⌉}) observations. The lower bound for the SNN model follows from our new result on recovering signals with multiple structures (e.g., sparse, low rank), which indicates the significant suboptimality of the common approach of minimizing the sum of individual sparsity-inducing norms (e.g., ℓ1, nuclear norm). Our new tractable formulation for low-rank tensor recovery shows how the sample complexity can be reduced by designing convex regularizers that exploit several structures jointly.
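
To make the two regularizers discussed in the abstract concrete, below is a minimal NumPy sketch (not the authors' implementation): snn computes the SNN regularizer as the sum of nuclear norms of the mode-k unfoldings, and square_norm computes, as an assumption about the "simple, new" relaxation, the nuclear norm of a single balanced reshaping that groups the first ⌊K/2⌋ modes into rows and the remaining ⌈K/2⌉ modes into columns, consistent with the O(r^{⌊K/2⌋} n^{⌈K/2⌉}) bound. All function names and the Tucker-form test tensor are illustrative.

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: bring mode `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def snn(T):
    """SNN regularizer: sum of nuclear norms of all mode-k unfoldings."""
    return sum(np.linalg.norm(unfold(T, k), ord='nuc') for k in range(T.ndim))

def square_norm(T):
    """Nuclear norm of a balanced ("square") reshaping: the first floor(K/2)
    modes index rows, the remaining ceil(K/2) modes index columns
    (an illustrative choice of grouping)."""
    rows = int(np.prod(T.shape[:T.ndim // 2]))
    return np.linalg.norm(T.reshape(rows, -1), ord='nuc')

# Build a random 4-way 10 x 10 x 10 x 10 tensor of Tucker rank (2, 2, 2, 2).
rng = np.random.default_rng(0)
n, r, K = 10, 2, 4
T = rng.standard_normal((r,) * K)          # core tensor
for k in range(K):
    U = rng.standard_normal((n, r))        # mode-k factor
    T = np.moveaxis(np.tensordot(U, np.moveaxis(T, k, 0), axes=(1, 0)), 0, k)

print("SNN regularizer:  ", snn(T))
print("square reshaping: ", square_norm(T))
```

In this sketch the balanced reshaping of a low-Tucker-rank tensor has matrix rank at most r^{⌊K/2⌋}, which is why a single nuclear norm on that matrix can serve as the convex surrogate in the new relaxation.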
