Analysis of Sparse Regularization Based Robust Regression Approaches

Regression in the presence of outliers is an inherently combinatorial problem. However, compressive sensing theory suggests that certain combinatorial optimization problems can be solved exactly by polynomial-time algorithms. Motivated by this connection, several research groups have proposed polynomial-time algorithms for robust regression. In this paper we specifically address the traditional robust regression problem, where the number of observations exceeds the number of unknown regression parameters and the structure of the regressor matrix is defined by the training dataset (and hence may not satisfy properties such as the Restricted Isometry Property or incoherence). We derive the precise conditions under which the sparse regularization (ℓ0- and ℓ1-norm) approaches solve the robust regression problem. We show that the smallest principal angle between the regressor subspace and all k-dimensional outlier subspaces is the fundamental quantity that determines the performance of these algorithms. In terms of this angle, we provide an estimate of the number of outliers the sparse regularization based approaches can handle. We then empirically evaluate the sparse (ℓ1-norm) regularization approach against other traditional robust regression algorithms to identify accurate and efficient algorithms for high-dimensional regression problems.
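To make the setup concrete, a common way to cast robust regression as a sparse recovery problem is to model the gross errors as a sparse vector e and solve min over (β, e) of ½‖y − Xβ − e‖² + λ‖e‖₁. The sketch below is one simple instance of the ℓ1-norm regularization approach discussed here, not the paper's exact algorithm; the function and parameter names (robust_regression_l1, lam, n_iter) are illustrative.

```python
import numpy as np

def robust_regression_l1(X, y, lam=1.0, n_iter=200):
    """Sketch: minimize 0.5*||y - X b - e||^2 + lam*||e||_1 over (b, e)
    by block coordinate descent (OLS step for b, soft-thresholding for e)."""
    n, p = X.shape
    e = np.zeros(n)
    b = np.zeros(p)
    for _ in range(n_iter):
        # Least-squares fit after subtracting the current outlier estimate.
        b, *_ = np.linalg.lstsq(X, y - e, rcond=None)
        # Soft-threshold the residuals: large residuals become outlier estimates,
        # small ones are absorbed by the quadratic data-fit term.
        r = y - X @ b
        e = np.sign(r) * np.maximum(np.abs(r) - lam, 0.0)
    return b, e

# Toy usage: a well-posed regression (n > p) with a few gross outliers.
rng = np.random.default_rng(0)
n, p, k = 100, 5, 10
X = rng.standard_normal((n, p))
beta_true = rng.standard_normal(p)
y = X @ beta_true + 0.05 * rng.standard_normal(n)
y[:k] += 10.0 * rng.standard_normal(k)   # gross outliers on the first k observations
beta_hat, e_hat = robust_regression_l1(X, y, lam=0.5)
```

The key quantity in the analysis, the smallest principal angle between the regressor subspace span(X) and a k-dimensional outlier (coordinate) subspace, can be computed for any candidate outlier support via an SVD. The paper's condition involves the minimum of this angle over all supports of size k, which is combinatorial, but for a single support the computation is straightforward (again a sketch with illustrative names):

```python
def smallest_principal_angle(X, support):
    """Smallest principal angle between span(X) and the coordinate subspace
    spanned by the standard basis vectors indexed by `support`."""
    U, _ = np.linalg.qr(X)                 # orthonormal basis of span(X)
    M = U[np.asarray(support), :]          # rows of U on the support = (U^T E_S)^T
    sigma_max = np.linalg.svd(M, compute_uv=False)[0]
    return np.arccos(min(sigma_max, 1.0))  # largest cosine -> smallest angle
```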
