USPACOR: Universal sparsity-controlling outlier rejection

The recent upsurge of research on compressive sampling and parsimonious signal representations hinges on signals being sparse, either naturally or after projection onto a proper basis. The present paper introduces a neat link between sparsity and a fundamental aspect of statistical inference, namely robustness against outliers, even when the signals involved are not sparse. It is argued that controlling the sparsity of model residuals leads to statistical learning algorithms that are computationally affordable and universally robust to outlier models. Analysis, comparisons, and corroborating simulations focus on robustifying linear regression, but a succinct overview of other areas is provided to highlight the universality of the novel framework.
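To make the residual-sparsity idea concrete, a common formulation models each observation as possibly corrupted by an additive outlier, y = Xb + o + noise, with o a sparse vector, and controls the sparsity of o through an ℓ1 penalty. The sketch below is an illustrative alternating-minimization solver for min_{b,o} ||y − Xb − o||² / 2 + λ||o||₁, not the paper's exact algorithm; the function names and the choice of solver are assumptions for illustration.

```python
import numpy as np

def soft_threshold(z, t):
    # Elementwise soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def robust_regression_sparse_outliers(X, y, lam, n_iter=200):
    """Illustrative block-coordinate descent for
        min_{b,o}  0.5 * ||y - X b - o||^2 + lam * ||o||_1.
    b-step: ordinary least squares on the outlier-corrected data y - o;
    o-step: soft-threshold the current residuals y - X b.
    The joint problem is convex, so the alternation converges."""
    n, p = X.shape
    o = np.zeros(n)
    X_pinv = np.linalg.pinv(X)  # reused across iterations
    for _ in range(n_iter):
        b = X_pinv @ (y - o)
        o = soft_threshold(y - X @ b, lam)
    return b, o
```

Observations whose residual magnitude stays below λ receive o_i = 0 and are fit by ordinary least squares; gross outliers are absorbed into the nonzero entries of o, so λ directly controls how many samples are rejected.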
