The 2-codeword screening test for lasso problems

Solving a lasso problem is a practical way to obtain a sparse representation of a signal with respect to a given dictionary. Motivated by the demand for sparse representations of large-scale data in machine learning and statistics, we study lasso screening tests, which improve solution efficiency by eliminating codewords that cannot appear in the optimal solution before the detailed computation begins. Building on the concept of a region test and the recently introduced dome test, we propose the 2-codeword test, which uses two codewords jointly in a correlation-based screening test. In addition to the rejection rate as a performance measure, we introduce a new way to assess a screening test, called the uncertainty measure, obtained by comparison with the optimal test.
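
To make the idea of a correlation screening test concrete, the sketch below implements the classic one-codeword SAFE/sphere rejection rule (in the spirit of the tests the 2-codeword test builds on), not the 2-codeword test itself, whose exact region is not specified in this abstract. The data, variable names, and threshold formula follow the standard lasso formulation min_b 0.5||y - Xb||^2 + lam||b||_1 and are illustrative assumptions.

```python
import numpy as np

def safe_screen(X, y, lam):
    """Basic one-codeword correlation screening rule for the lasso
        min_b 0.5*||y - X b||^2 + lam*||b||_1.
    Returns a boolean mask; True marks columns of X (codewords) whose
    coefficient is guaranteed to be zero at the optimum, so they can be
    discarded before running the full solver.
    """
    corr = X.T @ y                          # correlation of each codeword with the signal
    lam_max = np.max(np.abs(corr))          # smallest lam giving the all-zero solution
    col_norms = np.linalg.norm(X, axis=0)
    y_norm = np.linalg.norm(y)
    # Reject codeword j when its correlation provably falls below the
    # threshold implied by a spherical bound on the dual optimal point.
    threshold = lam - col_norms * y_norm * (lam_max - lam) / lam_max
    return np.abs(corr) < threshold

# Illustrative usage on random data with unit-norm dictionary atoms.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500))
X /= np.linalg.norm(X, axis=0)
y = rng.standard_normal(100)
lam = 0.5 * np.max(np.abs(X.T @ y))         # run at half of lam_max
rejected = safe_screen(X, y, lam)
print(f"screened out {rejected.sum()} of {X.shape[1]} codewords")
```

A region-based test of this kind is safe in the sense that it never removes a codeword that belongs to the optimal support; tighter regions (dome tests, two-codeword tests) reject more codewords at the cost of a slightly more expensive per-codeword check.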
