Adapting to unknown noise level in sparse deconvolution

In this paper, we study sparse spike deconvolution over the space of complex-valued measures when the input measure is a finite sum of Dirac masses. We introduce a modified version of the Beurling Lasso (BLasso), a semidefinite program that we refer to as the Concomitant Beurling Lasso (CBLasso). This new procedure estimates the target measure and the unknown noise level simultaneously. Unlike previous estimators in the literature, its theory holds for a tuning parameter that depends only on the sample size, so the procedure can be applied when the noise level is unknown. We prove that the noise level estimate is consistent. For the estimation of the Radon measure, our theoretical guarantees match the state-of-the-art results in super-resolution regarding minimax prediction and localization. The proofs rest on a new bound on the noise level, obtained from a tail estimate for the supremum of a stationary non-Gaussian process via the Rice method.
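The joint estimation of coefficients and noise level described above can be illustrated in a simpler, discretized setting. The sketch below is not the paper's CBLasso (which is a semidefinite program over complex measures); it is a scaled-Lasso-style alternation on a finite grid, in the spirit of [21] and [9]: a Lasso step with penalty `lam0 * sigma`, where `lam0` depends only on the sample size, alternating with the noise-level update `sigma = ||y - X beta|| / sqrt(n)`. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def scaled_lasso(X, y, lam0, n_outer=20, n_ista=200):
    """Jointly estimate a sparse coefficient vector and the noise level.

    lam0 is a pivotal tuning parameter depending only on the problem size,
    e.g. lam0 ~ sqrt(2 log(p) / n); the effective Lasso penalty is lam0 * sigma.
    """
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2          # ||X||_op^2: Lipschitz const. of n * grad
    beta = np.zeros(p)
    sigma = np.linalg.norm(y) / np.sqrt(n) # crude initial noise-level guess
    for _ in range(n_outer):
        lam = lam0 * sigma                 # penalty rescaled by current sigma
        step = n / L                       # step size for the smooth part
        for _ in range(n_ista):            # ISTA for the Lasso subproblem
            grad = X.T @ (X @ beta - y) / n
            beta = soft_threshold(beta - step * grad, step * lam)
        sigma = np.linalg.norm(y - X @ beta) / np.sqrt(n)  # noise-level update
    return beta, sigma

# Toy example: two spikes on a grid of p candidate locations.
rng = np.random.default_rng(0)
n, p, true_sigma = 200, 50, 0.5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[[3, 17]] = [2.0, -1.5]
y = X @ beta_true + true_sigma * rng.standard_normal(n)
beta_hat, sigma_hat = scaled_lasso(X, y, lam0=np.sqrt(2 * np.log(p) / n))
```

With this pivotal choice of `lam0`, the alternation typically recovers the true support and returns a noise-level estimate close to `true_sigma`, without `true_sigma` ever entering the tuning, which is the point of the concomitant formulation.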
