Structured Matrix Recovery via the Generalized Dantzig Selector

In recent years, structured matrix recovery problems have gained considerable attention due to their real-world applications, such as recommender systems and computer vision. Much of the existing work has focused on matrices with low-rank structure, and limited progress has been made on matrices with other types of structure. In this paper we present a non-asymptotic analysis for the estimation of generally structured matrices via the generalized Dantzig selector under generic sub-Gaussian measurements. We show that the estimation error can always be expressed succinctly in terms of a few geometric measures of suitable sets that depend only on the structure of the underlying true matrix. In addition, we derive general bounds on these geometric measures for structures characterized by unitarily invariant norms, a large family covering most matrix norms of practical interest. Examples are provided to illustrate the utility of our theoretical development.
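For concreteness, the generalized Dantzig selector referred to above is commonly formulated as the following convex program (a standard formulation sketched here for orientation; the symbols $\mathcal{X}$, $\lambda_n$, and $R^{*}$ are our notation, not taken from the abstract):

```latex
\hat{\Theta} \;=\; \operatorname*{arg\,min}_{\Theta \in \mathbb{R}^{d_1 \times d_2}} \; R(\Theta)
\quad \text{subject to} \quad
R^{*}\!\Big( \mathcal{X}^{*}\big( y - \mathcal{X}(\Theta) \big) \Big) \;\le\; \lambda_n,
```

where $R(\cdot)$ is the structure-inducing norm (e.g., the nuclear norm for low-rank structure), $R^{*}(\cdot)$ its dual norm, $\mathcal{X}$ the linear measurement operator with adjoint $\mathcal{X}^{*}$, $y$ the vector of noisy measurements, and $\lambda_n$ a tuning parameter chosen to dominate the noise term with high probability.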
