Computationally efficient sparse Bayesian learning via belief propagation

We propose a belief propagation (BP) based sparse Bayesian learning (SBL) algorithm, referred to as BP-SBL, for sparse signal recovery in large-scale compressed sensing problems. BP-SBL builds on a widely used hierarchical Bayesian model, which we convert to a factor graph so that BP can be applied for computational efficiency. The computational complexity of BP-SBL is almost linear in the number of transform coefficients, allowing the algorithm to handle large-scale compressed sensing problems efficiently. Numerical examples demonstrate the effectiveness of BP-SBL.
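To make the setting concrete, the sketch below shows a classical EM-based SBL iteration (in the spirit of Tipping's relevance vector machine) on the same hierarchical Gaussian model: y = Phi x + noise, with an independent zero-mean Gaussian prior on each coefficient whose precision alpha_i is learned from the data. This is not the paper's BP-SBL algorithm; it is a baseline illustration under stated assumptions, and all names here (sbl_em, Phi, alpha, noise_var, prune_thresh) are hypothetical. Its per-iteration O(N^3) posterior covariance inversion is precisely the cost that a BP-based message-passing scheme on the factor graph is intended to avoid, yielding near-linear scaling in the number of coefficients.

    # Minimal sketch of classical sparse Bayesian learning (SBL) via EM
    # on the hierarchical model y = Phi @ x + noise, x_i ~ N(0, 1/alpha_i).
    # Illustrative baseline only; not the BP-SBL algorithm of the paper.
    import numpy as np

    def sbl_em(Phi, y, noise_var=1e-2, n_iters=50, prune_thresh=1e6):
        M, N = Phi.shape
        alpha = np.ones(N)  # per-coefficient prior precisions (hyperparameters)
        for _ in range(n_iters):
            # Gaussian posterior over x given the current hyperparameters.
            A = np.diag(alpha)
            Sigma = np.linalg.inv(Phi.T @ Phi / noise_var + A)  # O(N^3): the bottleneck BP-SBL avoids
            mu = Sigma @ Phi.T @ y / noise_var
            # EM update of the precisions; a large alpha_i drives x_i toward zero.
            alpha = 1.0 / (mu**2 + np.diag(Sigma))
        mu = mu.copy()
        mu[alpha > prune_thresh] = 0.0  # prune coefficients with effectively infinite precision
        return mu

    # Toy usage: recover a sparse vector from compressive measurements.
    rng = np.random.default_rng(0)
    N, M, K = 128, 48, 5
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    x_true = np.zeros(N)
    x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    y = Phi @ x_true + 0.01 * rng.standard_normal(M)
    x_hat = sbl_em(Phi, y)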
