LASSO risk and phase transition under dependence

We consider the problem of recovering a k-sparse signal β0 ∈ R^p from noisy observations y = Xβ0 + w ∈ R^n. One of the most popular approaches is ℓ1-regularized least squares, also known as the LASSO. We analyze the mean squared error of the LASSO for random designs in which each row of X is drawn from the distribution N(0, Σ) with general covariance Σ. We first derive the asymptotic risk of the LASSO in the limit n, p → ∞ with n/p → δ. We then examine conditions on n, p, and k under which the LASSO exactly reconstructs β0 in the noiseless case w = 0. A phase boundary δc = δc(ε) is precisely established in the phase space defined by 0 ≤ δ, ε ≤ 1, where ε = k/p. Above this boundary, the LASSO perfectly recovers β0 with high probability; below it, the LASSO fails to recover β0 with high probability. While the values of the nonzero elements of β0 have no effect on the phase transition curve, our analysis shows that δc does depend on the sign pattern of the nonzero entries of β0 for general Σ ≠ Ip. This is in sharp contrast to previous phase transition results derived in the i.i.d. case Σ = Ip, where δc is completely determined by ε regardless of the distribution of β0. Underlying our formalism is the recently developed and efficient approximate message passing (AMP) algorithm; we generalize the state evolution of AMP from the i.i.d. case to the general case Σ ≠ Ip. Extensive computational experiments confirm that our theoretical predictions are consistent with simulation results on moderate-size systems.
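The noiseless recovery experiment described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's AMP algorithm: it solves the LASSO by plain iterative soft thresholding (ISTA) on a small correlated Gaussian design with Σij = ρ^|i−j| (rows generated as an AR(1) process), and checks that a k-sparse β0 with mixed signs is recovered when n/p is well above the phase boundary. All dimensions, the correlation ρ, and the regularization level are illustrative choices, not values from the paper.

```python
import random

random.seed(0)

def soft_threshold(x, t):
    """Proximal operator of t*|.|: shrink x toward zero by t."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def ista_lasso(X, y, lam, iters=1500):
    """Minimize 0.5*||y - X b||^2 + lam*||b||_1 by iterative soft thresholding."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    # Crude Lipschitz bound: squared Frobenius norm dominates the top
    # eigenvalue of X^T X, so step = 1/L guarantees (slow) convergence.
    L = sum(X[i][j] ** 2 for i in range(n) for j in range(p))
    step = 1.0 / L
    for _ in range(iters):
        resid = [sum(X[i][j] * beta[j] for j in range(p)) - y[i] for i in range(n)]
        grad = [sum(X[i][j] * resid[i] for i in range(n)) for j in range(p)]
        beta = [soft_threshold(beta[j] - step * grad[j], step * lam) for j in range(p)]
    return beta

def ar1_row(p, rho):
    """One design row with covariance Sigma_ij = rho^|i-j| (stationary AR(1))."""
    x = [random.gauss(0.0, 1.0)]
    for _ in range(p - 1):
        x.append(rho * x[-1] + (1 - rho ** 2) ** 0.5 * random.gauss(0.0, 1.0))
    return x

n, p, rho = 40, 10, 0.5                  # n/p = 4: well above the recovery boundary
X = [ar1_row(p, rho) for _ in range(n)]
beta0 = [0.0] * p
beta0[2], beta0[7] = 1.0, -1.0           # 2-sparse signal with mixed signs
y = [sum(X[i][j] * beta0[j] for j in range(p)) for i in range(n)]  # noiseless: w = 0

beta_hat = ista_lasso(X, y, lam=0.01)    # small lam approximates exact recovery
max_err = max(abs(beta_hat[j] - beta0[j]) for j in range(p))
```

Shrinking the regularization toward zero makes the LASSO solution approach the minimum-ℓ1 interpolator, which is the object whose success or failure the phase boundary δc(ε) describes; pushing n/p below the boundary in this sketch would make `max_err` remain large instead of vanishing.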
