Sparse regularization on thin grids I: the Lasso

This article analyzes the recovery performance, in the presence of noise, of sparse L1 regularization, often referred to as the Lasso or Basis Pursuit. We study the behavior of this method for inverse problem regularization as the discretization step size tends to zero. Assuming that the sought-after sparse sum of Diracs is recovered when there is no noise (a condition that has been thoroughly studied in the literature), we characterize the support (in particular the number of Dirac masses) estimated by the Lasso when noise is added to the observation. We identify a precise non-degeneracy condition that guarantees that the recovered support is close to the initial one. More precisely, we show that, in the small-noise regime, when this non-degeneracy condition holds, the method estimates twice as many spikes as the original signal contains: the Lasso detects two neighboring spikes around the location of each original spike. While this paper focuses on cases where the observations vary smoothly with the spike locations (e.g., the deconvolution problem with a smooth kernel), an interesting by-product is an abstract analysis of the support stability of discrete L1 regularization, which is of independent interest. We illustrate the usefulness of this abstract analysis by studying, for the first time, the support instability of compressed sensing recovery.
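To make the grid-discretized setting concrete, the following Python sketch (not code from the paper) solves the Lasso for a 1-D periodic deconvolution problem with a smooth Gaussian kernel on a fine grid, using FISTA (accelerated proximal gradient descent). The grid size, kernel width, noise level, regularization heuristic, and support threshold are all arbitrary demonstration choices; with off-grid spike locations and small noise, the recovered support typically exhibits the pair-of-neighboring-spikes behavior described above.

```python
# Illustrative sketch: grid-discretized Lasso for smooth 1-D deconvolution,
#   min_x  0.5 * ||Phi x - y||_2^2 + lam * ||x||_1
# solved with FISTA. All parameter values are demonstration choices.
import numpy as np

n = 512                           # number of grid points on the torus [0, 1)
grid = np.arange(n) / n
sigma = 0.03                      # width of the smooth convolution kernel

def kernel(t):
    """Gaussian bump evaluated with periodic (circular) distance."""
    d = np.minimum(np.abs(t), 1.0 - np.abs(t))
    return np.exp(-d**2 / (2 * sigma**2))

# Dictionary Phi: column j is the kernel centered at grid point j.
Phi = kernel(grid[:, None] - grid[None, :])

# Observation: two Diracs at OFF-GRID locations, convolved with the kernel,
# plus a small additive noise.
t_true = np.array([0.314, 0.672])
a_true = np.array([1.0, -0.8])
rng = np.random.default_rng(0)
y = sum(a * kernel(grid - t) for a, t in zip(a_true, t_true))
y = y + 0.01 * rng.standard_normal(n)

# Regularization strength: a fraction of lam_max = ||Phi^T y||_inf
# (above lam_max the Lasso solution is identically zero).
lam = 0.1 * np.abs(Phi.T @ y).max()

# FISTA iterations.
L = np.linalg.norm(Phi, 2) ** 2   # Lipschitz constant of the smooth part
x = np.zeros(n)
z, t = x.copy(), 1.0
for _ in range(2000):
    grad = Phi.T @ (Phi @ z - y)
    x_new = z - grad / L
    # Soft-thresholding = proximal operator of lam/L * ||.||_1.
    x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - lam / L, 0.0)
    t_new = (1 + np.sqrt(1 + 4 * t**2)) / 2
    z = x_new + (t - 1) / t_new * (x_new - x)
    x, t = x_new, t_new

support = np.nonzero(np.abs(x) > 1e-3 * np.abs(x).max())[0]
print("recovered support (grid indices):", support)
print("true locations as grid positions:", t_true * n)
# In the small-noise regime one typically observes pairs of adjacent active
# grid points bracketing each true spike location: the "spike doubling"
# phenomenon analyzed in the paper.
```

Running the sketch, the printed support usually consists of clusters of adjacent grid indices around `t_true * n`; shrinking the noise level and `lam` together makes the two-neighbors-per-spike structure sharper, consistent with the small-noise analysis above.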
