Group-level support recovery guarantees for the group Lasso estimator

This paper considers the problem of estimating an unknown high-dimensional signal from noisy linear measurements, where the number of measurements is typically much smaller than the signal dimension. The signal is assumed to possess a group-sparse structure: given a predefined partition of its entries into groups, only a small number of those groups are non-zero. Assuming the unknown group-sparse signal is generated according to a certain statistical model, we provide guarantees under which it can be efficiently estimated by solving the well-known group Lasso problem. In particular, we show that the set of indices of the non-zero groups (called the group-level support of the signal) is exactly recovered by the group Lasso, provided the non-zero groups are few in number and each carries sufficient energy. Our guarantees rely on the well-conditioning of the measurement matrix, expressed in terms of its block coherence parameter, which can be computed efficiently. Our results are non-asymptotic in nature and therefore applicable in practical, finite-sample settings.
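For concreteness, the setup above can be made explicit in standard notation (the symbols below are ours, not necessarily those of the paper). Given measurements $y = Ax^* + w$ with measurement matrix $A \in \mathbb{R}^{n \times p}$, noise $w$, and a partition $G_1, \dots, G_M$ of $\{1, \dots, p\}$, the group Lasso estimate is

$$\hat{x} \in \arg\min_{x \in \mathbb{R}^{p}} \; \tfrac{1}{2}\,\|y - Ax\|_2^2 + \lambda \sum_{m=1}^{M} \|x_{G_m}\|_2,$$

and exact group-level support recovery means $\{m : \hat{x}_{G_m} \neq 0\} = \{m : x^*_{G_m} \neq 0\}$. For equal-size blocks $A_{G_m} \in \mathbb{R}^{n \times d}$, the block coherence is commonly defined as

$$\mu_B = \max_{m \neq m'} \frac{1}{d}\, \big\|A_{G_m}^{\top} A_{G_{m'}}\big\|_{2},$$

where $\|\cdot\|_2$ denotes the spectral norm; it is computable in polynomial time directly from $A$, in contrast to conditions such as the restricted isometry property.

The sketch below illustrates both quantities: a proximal-gradient (ISTA) solver for the group Lasso via block soft-thresholding, and a direct computation of $\mu_B$. It is a minimal illustration under the assumptions just stated (equal-size groups, our notation), not the authors' implementation.

```python
import numpy as np

def group_soft_threshold(v, t):
    """Prox of t * ||.||_2: shrink the l2 norm of v by t, zeroing small blocks."""
    norm = np.linalg.norm(v)
    return np.zeros_like(v) if norm <= t else (1.0 - t / norm) * v

def group_lasso(A, y, groups, lam, n_iter=500):
    """Proximal gradient (ISTA) for the group Lasso.

    A: (n, p) matrix; y: (n,) measurements; groups: list of index arrays
    partitioning range(p); lam: regularization weight.
    """
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = x - step * (A.T @ (A @ x - y))      # gradient step on the quadratic loss
        for g in groups:                        # block soft-threshold each group
            x[g] = group_soft_threshold(z[g], step * lam)
    return x

def block_coherence(A, groups):
    """Block coherence: max over distinct blocks of ||A_i^T A_j||_2 / d
    (assumes equal-size groups of size d)."""
    d = len(groups[0])
    return max(
        np.linalg.norm(A[:, gi].T @ A[:, gj], 2) / d
        for i, gi in enumerate(groups)
        for j, gj in enumerate(groups)
        if i != j
    )
```

The estimated group-level support is then $\{m : \|\hat{x}_{G_m}\|_2 > \tau\}$ for a small numerical tolerance $\tau$, and recovery succeeds when this set matches the true support.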
