Analysis of Sparse Regularization Based Robust Regression Approaches

Regression in the presence of outliers is an inherently combinatorial problem. However, compressive sensing theory suggests that certain combinatorial optimization problems can be solved exactly by polynomial-time algorithms. Motivated by this connection, several research groups have proposed polynomial-time algorithms for robust regression. In this paper, we specifically address the traditional robust regression problem, where the number of observations exceeds the number of unknown regression parameters and the structure of the regressor matrix is determined by the training dataset (and hence may not satisfy properties such as the Restricted Isometry Property or incoherence). We derive the precise conditions under which the sparse regularization (ℓ0- and ℓ1-norm) approaches solve the robust regression problem. We show that the smallest principal angle between the regressor subspace and the k-dimensional outlier subspaces is the fundamental quantity that determines the performance of these algorithms, and in terms of this angle we provide an estimate of the number of outliers the sparse-regularization-based approaches can handle. We then empirically evaluate the sparse (ℓ1-norm) regularization approach against other traditional robust regression algorithms to identify accurate and efficient algorithms for high-dimensional regression problems.
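
To make the ℓ1-norm formulation concrete, the sketch below (Python/NumPy) illustrates one standard way such a sparse-outlier regression model can be posed and solved: the observations are modeled as y = Xθ + e + n with a sparse outlier vector e, and θ and e are estimated by alternating a least-squares update with a soft-thresholding update, which minimizes 0.5‖y − Xθ − e‖² + λ‖e‖₁. It also computes the smallest principal angle between the regressor subspace and a given k-dimensional outlier (coordinate) subspace, the quantity highlighted in the abstract. This is a minimal sketch under illustrative assumptions; the alternating scheme, the weight λ, and the function names are not taken from the paper and need not match the authors' exact algorithm.

```python
# Minimal sketch (illustrative, not the authors' exact method) of l1-penalized
# robust regression with a sparse outlier vector:
#   min_{theta, e}  0.5 * ||y - X @ theta - e||_2^2 + lam * ||e||_1
import numpy as np

def soft_threshold(r, t):
    """Entrywise soft-thresholding, the proximal operator of t*||.||_1."""
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0)

def l1_robust_regression(X, y, lam=0.5, n_iter=100):
    """Alternating minimization: least squares in theta, soft-thresholding in e."""
    n, p = X.shape
    e = np.zeros(n)
    theta = np.zeros(p)
    for _ in range(n_iter):
        # theta-step: ordinary least squares on the outlier-corrected response.
        theta, *_ = np.linalg.lstsq(X, y - e, rcond=None)
        # e-step: closed-form minimizer of 0.5*||r - e||^2 + lam*||e||_1.
        e = soft_threshold(y - X @ theta, lam)
    return theta, e

def smallest_principal_angle(X, outlier_idx):
    """Smallest principal angle (radians) between range(X) and the coordinate
    subspace spanned by the standard basis vectors indexed by outlier_idx."""
    n = X.shape[0]
    Qx, _ = np.linalg.qr(X)                # orthonormal basis for the regressor subspace
    E = np.eye(n)[:, outlier_idx]          # orthonormal basis for the outlier subspace
    s = np.linalg.svd(Qx.T @ E, compute_uv=False)
    # Singular values are the cosines of the principal angles; the largest
    # cosine corresponds to the smallest angle.
    return np.arccos(np.clip(s.max(), -1.0, 1.0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, k = 200, 5, 20                   # n >> p, with k gross outliers (hypothetical sizes)
    X = rng.standard_normal((n, p))
    theta_true = rng.standard_normal(p)
    y = X @ theta_true + 0.01 * rng.standard_normal(n)
    idx = rng.choice(n, size=k, replace=False)
    y[idx] += 10.0 * rng.standard_normal(k)   # inject sparse gross outliers
    theta_hat, e_hat = l1_robust_regression(X, y)
    print("estimation error:", np.linalg.norm(theta_hat - theta_true))
    print("smallest principal angle (rad):", smallest_principal_angle(X, idx))
```

In this sketch, a larger smallest principal angle between the regressor subspace and the outlier subspace makes it easier to separate the outlier component e from the signal component Xθ, which is the geometric intuition behind the analysis described in the abstract.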
