Modal regression based greedy algorithm for robust sparse signal recovery, clustering and classification

Abstract The greedy algorithm (GA) is an efficient sparse representation framework with numerous applications in machine learning and computer vision. However, conventional GA methods may fail when applied to grossly corrupted data because they iteratively estimate the sparse signal using least squares regression, which is sensitive to gross corruption and outliers. In this paper, we propose a modal regression based greedy algorithm, referred to as MROMP (modal regression based orthogonal matching pursuit), to robustly learn the sparse signal from corrupted measurements. Unlike previous GA methods, MROMP builds on sparse modal regression, which is robust to heavy-tailed noise and outliers. To optimize MROMP efficiently, we devise two half-quadratic based algorithms with guaranteed convergence. As two further contributions, we leverage MROMP to develop a robust subspace clustering method for data lying in a union of subspaces, and a robust pattern classification method that assigns data to the class they belong to. Experimental results on both simulated and real datasets demonstrate the efficacy and robustness of MROMP for sparse signal recovery, data clustering and classification, especially on grossly corrupted data.
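To make the core idea concrete, the following is a minimal, hypothetical Python sketch of an OMP-style iteration in which the usual least-squares coefficient refit is replaced by a Gaussian-kernel-weighted refit solved by half-quadratic (iteratively reweighted least squares) updates. This is not the paper's implementation; the function name `mromp_sketch`, the bandwidth `sigma`, and the iteration counts are illustrative assumptions.

```python
import numpy as np

def mromp_sketch(A, y, k, sigma=1.0, n_hq_iters=10):
    """Hypothetical sketch: greedy atom selection as in OMP, but the
    coefficient refit uses modal-regression-style Gaussian weights on the
    residuals (half-quadratic / IRLS), which down-weights grossly
    corrupted measurements instead of fitting them in a least-squares sense."""
    m, n = A.shape
    support = []
    x = np.zeros(n)
    residual = y.copy()
    xs = np.zeros(0)
    for _ in range(k):
        # Greedy selection: atom most correlated with the current residual.
        corr = np.abs(A.T @ residual)
        corr[support] = -np.inf                 # do not re-select atoms
        support.append(int(np.argmax(corr)))
        As = A[:, support]
        # Initialize with an ordinary least-squares fit on the support.
        xs = np.linalg.lstsq(As, y, rcond=None)[0]
        # Half-quadratic refit: alternate Gaussian-kernel weights and a
        # weighted least-squares solve.
        for _ in range(n_hq_iters):
            r = y - As @ xs
            w = np.exp(-r**2 / (2 * sigma**2))  # small weight for large residuals
            AtW = As.T * w
            xs = np.linalg.solve(AtW @ As + 1e-8 * np.eye(len(support)), AtW @ y)
        residual = y - As @ xs
    x[support] = xs
    return x
```

A simple usage would be `x_hat = mromp_sketch(A, y, k=5)` for a measurement matrix `A`, corrupted observations `y`, and sparsity level 5; the bandwidth `sigma` controls how aggressively large residuals are discounted.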
