Greedy Algorithms for Joint Sparse Recovery

Five greedy algorithms designed for the single measurement vector setting in compressed sensing and sparse approximation are extended to the multiple measurement vector scenario: Iterative Hard Thresholding (IHT), Normalized IHT (NIHT), Hard Thresholding Pursuit (HTP), Normalized HTP (NHTP), and Compressive Sampling Matching Pursuit (CoSaMP). Using the asymmetric restricted isometry property (ARIP), sufficient conditions are established for all five algorithms that bound the discrepancy between an algorithm's output and the optimal row-sparse representation. When the initial multiple measurement vectors are jointly sparse, ARIP-based guarantees of exact recovery are also established. The algorithms are then compared via the recovery phase transition framework. The strong phase transitions, which describe the family of Gaussian matrices satisfying the sufficient conditions, are obtained from known bounds on the ARIP constants, and the algorithms' empirical weak phase transitions are compared for varying numbers of measurement vectors. Finally, the algorithms' performance is compared against that of a known rank-aware greedy algorithm, Rank Aware Simultaneous Orthogonal Matching Pursuit + MUSIC. The simultaneous recovery variants of NIHT, NHTP, and CoSaMP all outperform the rank-aware algorithm.
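
To make the row-sparse (MMV) recovery problem and the flavor of these hard-thresholding iterations concrete, the sketch below gives a minimal simultaneous normalized IHT step in NumPy: given measurements Y = AX with X row-k-sparse, each iteration takes a gradient step whose length comes from an exact line search restricted to the current row support, then hard thresholds to the k rows of largest l2 norm. The function names (`sniht`, `top_k_rows`), the fixed iteration budget, the tolerance, and the omission of NIHT's step-size safeguard are simplifications for illustration, not the paper's implementation.

```python
import numpy as np

def top_k_rows(X, k):
    """Boolean mask selecting the k rows of X with largest l2 norm."""
    idx = np.argsort(np.linalg.norm(X, axis=1))[-k:]
    mask = np.zeros(X.shape[0], dtype=bool)
    mask[idx] = True
    return mask

def sniht(A, Y, k, iters=200, tol=1e-12):
    """Sketch of a simultaneous normalized IHT iteration for the MMV model
    Y = A X with X row-k-sparse (step-size safeguard of full NIHT omitted)."""
    n = A.shape[1]
    X = np.zeros((n, Y.shape[1]))
    S = top_k_rows(A.T @ Y, k)                 # initial support estimate
    for _ in range(iters):
        R = Y - A @ X                          # residual
        G = A.T @ R                            # gradient of 0.5*||Y - A X||_F^2
        GS = np.where(S[:, None], G, 0.0)      # gradient restricted to support
        denom = np.linalg.norm(A @ GS) ** 2
        if denom < tol:                        # residual (on support) is zero
            break
        omega = np.linalg.norm(GS) ** 2 / denom  # exact line search on S
        X = X + omega * G                      # gradient step
        S = top_k_rows(X, k)                   # hard threshold: keep k largest rows
        X = np.where(S[:, None], X, 0.0)
    return X

if __name__ == "__main__":
    # Illustrative use: Gaussian matrix, jointly (row-)sparse signal.
    rng = np.random.default_rng(0)
    m, n, l, k = 80, 200, 5, 10
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    X_true = np.zeros((n, l))
    X_true[rng.choice(n, k, replace=False)] = rng.standard_normal((k, l))
    Y = A @ X_true
    X_hat = sniht(A, Y, k)
    print(np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))
```

The other algorithms discussed in the abstract follow the same template; they differ mainly in how the step size is chosen and in whether the support update is followed by a least-squares projection (HTP/NHTP) or a larger support expansion and pruning step (CoSaMP).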
