Group Sparsity via SURE Based on Regression Parameter Mean Squared Error

Any regularization method requires the selection of a penalty parameter, and many model selection criteria have been developed based on various discrepancy measures. Most attention has focused on prediction mean squared error. In this paper we develop a model selection criterion based on regression-parameter mean squared error via SURE (Stein's unbiased risk estimator). We then apply it to the l1-penalized least squares problem with grouped variables on over-determined systems. Simulation results based on topology identification of a sparse network illustrate the criterion and compare it with alternative model selection criteria.
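To make the SURE idea concrete, here is a minimal sketch of the classical prediction-MSE SURE for scalar soft thresholding (Stein, 1981), used to pick a threshold on synthetic sparse data. This is a simplified illustration only, not the paper's regression-parameter-MSE criterion or its group-sparse setting; the signal, noise level, and threshold grid are all hypothetical choices.

```python
import numpy as np

def sure_soft_threshold(y, lam, sigma):
    """Stein's unbiased risk estimate of the prediction MSE of the
    soft-threshold estimator applied to y ~ N(theta, sigma^2 I):
    SURE(lam) = n*sigma^2 - 2*sigma^2 * #{|y_i| <= lam} + sum_i min(y_i^2, lam^2).
    """
    n = y.size
    return (n * sigma**2
            - 2.0 * sigma**2 * np.sum(np.abs(y) <= lam)
            + np.sum(np.minimum(y**2, lam**2)))

# Synthetic sparse signal in Gaussian noise (illustrative values).
rng = np.random.default_rng(0)
n, sigma = 200, 1.0
theta = np.zeros(n)
theta[:10] = 5.0                      # a few large coefficients, rest zero
y = theta + sigma * rng.standard_normal(n)

# Minimize SURE over a grid of candidate thresholds.
lams = np.linspace(0.0, 5.0, 200)
risks = np.array([sure_soft_threshold(y, lam, sigma) for lam in lams])
lam_star = lams[np.argmin(risks)]
```

At `lam = 0` the estimator is the identity and SURE reduces to `n*sigma**2`, the exact risk of the unregularized estimate, which is a quick sanity check on the formula.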
