Low Complexity Static and Dynamic Sparse Bayesian Learning Combining BP, VB and EP Message Passing

Sparse Bayesian Learning (SBL) provides sophisticated (state) model order selection with unknown support distribution. This makes it possible to handle problems with large state dimensions and relatively limited data by exploiting variations in parameter importance. The techniques proposed in this paper extend SBL to time-varying states, modeled as diagonal first-order auto-regressive (DAR(1)) processes whose parameters must also be estimated. Adding these parameters to the state leads to an augmented state and a non-linear (at least bilinear) state-space model. The proposed approach, which also applies to more general non-linear models, combines belief propagation (BP), Variational Bayes (VB) or mean field (MF) techniques, and Expectation Propagation (EP) to approximate the posterior marginal distributions of the scalar factors. We propose a Fisher Information Matrix analysis to determine the split of variables between BP and VB that keeps the result optimal in the sense of the Laplace approximation.
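To ground the starting point of the paper, the following is a minimal illustrative sketch of *classic static* SBL for a linear model y = A x + n, using Tipping-style EM updates of the per-coefficient prior variances gamma_i. This is background only, not the BP/VB/EP message-passing algorithm proposed in the paper; all variable names and problem sizes are illustrative assumptions.

```python
import numpy as np

# Sketch: static SBL via EM (background for the paper, not its proposed method).
# Model: y = A x + n, with x_i ~ N(0, gamma_i) and known noise variance sigma2.
rng = np.random.default_rng(0)
M, N, K = 40, 80, 4                         # measurements, dimension, sparsity
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
support = rng.choice(N, K, replace=False)
x_true[support] = (1.0 + rng.random(K)) * rng.choice([-1.0, 1.0], K)
sigma = 1e-2
y = A @ x_true + sigma * rng.standard_normal(M)

gamma = np.ones(N)                          # hyperparameters: prior variances
sigma2 = sigma**2
for _ in range(30):
    # E-step: Gaussian posterior of x for the current gamma
    Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / gamma))
    mu = Sigma @ (A.T @ y) / sigma2
    # M-step (EM update): gamma_i <- E[x_i^2] under the posterior
    gamma = np.maximum(mu**2 + np.diag(Sigma), 1e-10)  # floor avoids blow-up

est_support = np.argsort(gamma)[-K:]        # largest prior variances = kept atoms
```

Coefficients outside the true support see their gamma_i driven toward zero, which is the automatic model order selection that the paper extends to DAR(1) time-varying states.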
