Two-stage convex relaxation approach to low-rank and sparsity regularized least squares loss

In this paper we consider the rank and zero-norm regularized least squares loss minimization problem with a spectral norm ball constraint. For this class of NP-hard optimization problems, we propose a two-stage convex relaxation approach based on majorizing suitable locally Lipschitz continuous surrogates. Furthermore, the Frobenius norm error bound for the optimal solution of each stage is characterized, and a theoretical guarantee is established for the two-stage convex relaxation approach by showing that, under a suitable restricted eigenvalue condition, the error bound of the first-stage convex relaxation (i.e., the nuclear norm and $$\ell_1$$-norm regularized minimization problem) is significantly reduced by the second-stage convex relaxation. Finally, we verify the efficiency of the proposed approach on random test problems and some real-world problems.
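
To make the two-stage scheme concrete, the following Python sketch illustrates the overall procedure under simplifying assumptions: the sampling operator is taken to be the identity (i.e., a noisy observation M of a low-rank-plus-sparse matrix is available entrywise), the first stage is the nuclear norm plus $$\ell_1$$-norm regularized least squares problem over a spectral norm ball, and the second stage is an illustrative reweighting of both terms driven by the first-stage solution. The function names (`stage_solver`, `two_stage`) and the particular reweighting rule are hypothetical stand-ins for the tighter surrogates constructed in the paper, and the solver is a plain proximal-gradient scheme rather than the paper's algorithm.

```python
import numpy as np

def prox_weighted_nuclear(Z, tau, w=None):
    # (Weighted) singular-value soft-thresholding. With weights non-decreasing in index
    # (larger singular values penalized less), this is the prox of the weighted nuclear norm.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    w = np.ones_like(s) if w is None else w
    return (U * np.maximum(s - tau * w, 0.0)) @ Vt

def prox_weighted_l1(Z, tau, W=None):
    # Entrywise (weighted) soft-thresholding: prox of the weighted l1-norm.
    W = np.ones_like(Z) if W is None else W
    return np.sign(Z) * np.maximum(np.abs(Z) - tau * W, 0.0)

def project_spectral_ball(Z, radius):
    # Projection onto {Z : ||Z||_2 <= radius} by clipping singular values at the radius.
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return (U * np.minimum(s, radius)) @ Vt

def stage_solver(M, lam, mu, radius, w_sv=None, W_ent=None, n_iter=300, step=0.5):
    # One convex-relaxation stage for the simplified observation model M = L + S + noise:
    #   min_{L,S} 0.5*||L + S - M||_F^2 + lam*||L||_{*,w} + mu*||S||_{1,W},  s.t. ||L||_2 <= radius.
    # Solved by proximal gradient; projecting onto the spectral ball right after the nuclear
    # prox step is a heuristic splitting, not the exact prox of the constrained term.
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        R = L + S - M  # gradient of the least squares loss w.r.t. both L and S
        L = prox_weighted_nuclear(L - step * R, step * lam, w_sv)
        L = project_spectral_ball(L, radius)
        S = prox_weighted_l1(S - step * R, step * mu, W_ent)
    return L, S

def two_stage(M, lam, mu, radius, eps=1e-3):
    # Stage 1: plain nuclear norm + l1-norm relaxation (all weights equal to one).
    L1, S1 = stage_solver(M, lam, mu, radius)
    # Stage 2 (illustrative): singular values and entries that are large in the first-stage
    # solution receive smaller weights, mimicking the tighter second-stage surrogate.
    s1 = np.linalg.svd(L1, compute_uv=False)
    w_sv = eps / (s1 + eps)
    W_ent = eps / (np.abs(S1) + eps)
    return stage_solver(M, lam, mu, radius, w_sv=w_sv, W_ent=W_ent)
```

In this sketch the second stage penalizes the components identified as significant in the first stage more lightly, which is the mechanism by which a second-stage relaxation can tighten the first-stage estimate; the exact surrogate and error analysis are those developed in the paper.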
