Simultaneously Sparse and Low-Rank Matrix Reconstruction via Nonconvex and Nonseparable Regularization

Many real-world problems involve the recovery of a matrix from linear measurements, where the matrix lies close to some low-dimensional structure. This paper considers the problem of reconstructing a matrix with a simultaneously sparse and low-rank model. Because sparsity and matrix rank are non-convex and discontinuous functions, the $\ell_1$ norm and the nuclear norm are often used as surrogates to derive efficient algorithms that promote sparse and low-rank characteristics, respectively. However, the $\ell_1$ norm and the nuclear norm are loose approximations, and a recent study reveals that using convex regularizers for joint structures cannot do better, order-wise, than exploiting only one of the structures. Motivated by the construction of non-convex and non-separable regularization in sparse Bayesian learning, a new optimization problem is formulated in the latent space for recovering a simultaneously sparse and low-rank matrix. The newly proposed non-convex cost function is proven to recover a simultaneously sparse and low-rank matrix given a sufficient number of noiseless linear measurements. In addition, an algorithm is derived to solve the resulting non-convex optimization problem, and its convergence is analyzed. The performance of the proposed approach is demonstrated by experiments on both synthetic data and real hyperspectral images for compressive sensing applications.
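For context, the convex baseline that the abstract calls a loose approximation solves $\min_X \tfrac{1}{2}\|y - A\,\mathrm{vec}(X)\|_2^2 + \lambda_1\|X\|_1 + \lambda_2\|X\|_*$. The sketch below is a minimal proximal-gradient illustration of that baseline only, not the proposed non-convex, non-separable latent-space method; the function names, parameters (`lam1`, `lam2`, `step`), and the sequential application of the two proximal maps are illustrative assumptions.

```python
import numpy as np

def soft_threshold(X, t):
    """Entrywise soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def svt(X, t):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def sparse_low_rank_pgd(y, A, shape, lam1=0.1, lam2=0.1, n_iter=300, step=None):
    """Proximal-gradient baseline for
        min_X 0.5*||y - A vec(X)||^2 + lam1*||X||_1 + lam2*||X||_*,
    applying the two proximal maps one after the other (an approximation
    to the joint proximal operator of the sum of the two regularizers)."""
    m, n = shape
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L for the quadratic data-fit term
    x = np.zeros(m * n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                 # gradient of the data-fit term
        X = (x - step * grad).reshape(m, n)
        X = soft_threshold(X, step * lam1)       # promote entrywise sparsity
        X = svt(X, step * lam2)                  # promote low rank
        x = X.ravel()
    return x.reshape(m, n)

# Small synthetic check: a sparse, rank-1 20x20 matrix from random linear measurements.
rng = np.random.default_rng(0)
u = np.zeros(20); u[:4] = rng.standard_normal(4)
v = np.zeros(20); v[:4] = rng.standard_normal(4)
X_true = np.outer(u, v)
A = rng.standard_normal((240, 400)) / np.sqrt(240)
y = A @ X_true.ravel()
X_hat = sparse_low_rank_pgd(y, A, (20, 20), lam1=0.01, lam2=0.01)
print("relative error:", np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```

This convex combination of penalties is exactly the kind of simultaneous regularization whose order-wise limitation the abstract cites as motivation for the non-convex, non-separable alternative developed in the paper.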
