Details on AMP-B-SBL: An Algorithm for Recovery of Clustered Sparse Signals Using Approximate Message Passing [1-3]

We consider the inverse problem of compressive sensing in the single measurement vector (SMV) setting, where the signal has an unknown block-sparsity structure. For this purpose, we propose a sparse Bayesian learning (SBL) algorithm simplified via the approximate message passing (AMP) framework. To encourage block sparsity, we incorporate a total-variation measure on the support set of the solution, referred to as Sigma-Delta, which quantifies how clustered the support is. The AMP framework reduces the computational load of the proposed SBL algorithm and thereby makes it faster than its message-passing counterpart. Furthermore, in terms of the mean-squared error between the true and reconstructed solutions, the algorithm shows an encouraging improvement over competing algorithms.

Keywords— Compressed sensing (CS), sparse Bayesian learning (SBL), approximate message passing (AMP), clustered sparsity, single measurement vector (SMV).

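The two ingredients named in the abstract can be illustrated with a short sketch. The Python code below is illustrative only and is not the authors' AMP-B-SBL implementation: `sigma_delta` assumes the Sigma-Delta measure counts transitions in a binary support indicator (a discrete total variation), and `amp_recover` substitutes a generic soft-thresholding AMP iteration for the SBL-derived denoiser, solely to show that each AMP iteration costs two matrix-vector products.

```python
import numpy as np

def sigma_delta(support):
    """Discrete total variation of a binary support indicator.

    Counts transitions between zero and nonzero entries: a small value
    means the nonzeros form few clusters (block-sparse), a large value
    means they are scattered.
    """
    s = np.asarray(support, dtype=int)
    return int(np.sum(np.abs(np.diff(s))))

def soft_threshold(v, tau):
    """Elementwise soft-thresholding denoiser (stand-in for the SBL denoiser)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp_recover(A, y, n_iter=30, alpha=1.5):
    """Generic AMP iteration for y = A x + n.

    Each iteration needs only A.T @ z and A @ x, which is what keeps the
    per-iteration cost low compared with full message passing.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        r = x + A.T @ z                       # pseudo-data passed to the denoiser
        tau = alpha * np.sqrt(np.mean(z**2))  # threshold tied to residual level
        x = soft_threshold(r, tau)
        b = np.count_nonzero(x) / m           # Onsager correction coefficient
        z = y - A @ x + b * z                 # corrected residual
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 200, 80
    x_true = np.zeros(n)
    x_true[50:60] = rng.normal(size=10)       # one cluster of nonzeros
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x_true + 0.01 * rng.normal(size=m)

    x_hat = amp_recover(A, y)
    print("Sigma-Delta of true support:", sigma_delta(x_true != 0))  # 2: a single block
    print("MSE:", np.mean((x_hat - x_true) ** 2))
```

For a scattered support with the same number of nonzeros, `sigma_delta` returns a much larger value; this is the property a Sigma-Delta-type penalty exploits to favor clustered solutions.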
[1] Todd K. Moon et al., "On the block-sparse solution of single measurement vectors," 49th Asilomar Conference on Signals, Systems and Computers, 2015.

[2] Bhaskar D. Rao et al., "On the benefits of the block-sparsity structure in sparse signal recovery," IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2012.

[3] Bhaskar D. Rao et al., "Extension of SBL Algorithms for the Recovery of Block Sparse Signals With Intra-Block Correlation," IEEE Transactions on Signal Processing, 2012.

[4] Bhaskar D. Rao et al., "Recovery of block sparse signals using the framework of block sparse Bayesian learning," IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2012.

[5] Hong Sun et al., "Bayesian compressive sensing for cluster structured sparse signals," Signal Processing, 2012.

[6] Philip Schniter et al., "Turbo reconstruction of structured sparse signals," 44th Annual Conference on Information Sciences and Systems (CISS), 2010.

[7] Y. C. Pati et al., "Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition," Proceedings of the 27th Asilomar Conference on Signals, Systems and Computers, 1993.

[8] Philip Schniter et al., "Dynamic Compressive Sensing of Time-Varying Signals Via Approximate Message Passing," IEEE Transactions on Signal Processing, 2012.

[9] Yonina C. Eldar et al., "Reduce and Boost: Recovering Arbitrary Sets of Jointly Sparse Vectors," IEEE Transactions on Signal Processing, 2008.

[10] D. Donoho et al., "Basis pursuit," Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers, 1994.

[11] Michael Elad et al., "A Plurality of Sparse Representations Is Better Than the Sparsest One Alone," IEEE Transactions on Information Theory, 2009.

[12] Andrea Montanari et al., "Message-passing algorithms for compressed sensing," Proceedings of the National Academy of Sciences, 2009.

[13] Todd K. Moon et al., "On the block-sparsity of multiple-measurement vectors," IEEE Signal Processing and Signal Processing Education Workshop (SP/SPE), 2015.

[14] Mohammad Shekaramiz et al., "Sparse Signal Recovery Based on Compressive Sensing and Exploration Using Multiple Mobile Sensors," 2018.

[15] Bhaskar D. Rao et al., "Sparse solutions to linear inverse problems with multiple measurement vectors," IEEE Transactions on Signal Processing, 2005.

[16] Todd K. Moon et al., "AMP-B-SBL: An algorithm for clustered sparse signals using approximate message passing," IEEE 7th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), 2016.

[17] Todd K. Moon et al., "Hierarchical Bayesian approach for jointly-sparse solution of multiple-measurement vectors," 48th Asilomar Conference on Signals, Systems and Computers, 2014.

[18] Andrea Montanari et al., "Message passing algorithms for compressed sensing: I. Motivation and construction," IEEE Information Theory Workshop (ITW), Cairo, 2010.