Hybrid solver for hierarchical Bayesian inverse problems

The recovery of sparse generative models from few noisy measurements is an important and challenging problem. Many deterministic algorithms rely on some form of $\ell_1$-$\ell_2$ minimization to combine the computational convenience of the $\ell_2$ penalty with the sparsity promotion of the $\ell_1$ penalty. It was recently shown, within the Bayesian framework, that sparsity promotion and computational efficiency can both be attained with hierarchical models having conditionally Gaussian priors and gamma hyperpriors. The associated Gibbs energy functional is convex, and its minimizer, which is the MAP estimate of the posterior, can be computed efficiently with the globally convergent Iterated Alternating Sequential (IAS) algorithm \cite{CSS}. Generalizing the hyperpriors of these sparsity-promoting hierarchical models to the generalized gamma family yields Gibbs energy functionals that are either globally convex or, for some choices of the hyperparameters, only locally convex \cite{CPrSS}. The main difficulty in computing the MAP estimate with greedy hyperpriors that strongly promote sparsity is the presence of local minima. To avoid premature stopping at a spurious local minimizer, we propose two hybrid algorithms that first exploit the global convergence associated with gamma hyperpriors to arrive in a neighborhood of the unique minimizer, and then switch to a generalized gamma hyperprior that promotes sparsity more strongly. The performance of the two algorithms is illustrated with computed examples.
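To make the alternating structure concrete, the following is a minimal Python sketch of an IAS-type iteration and of the two-stage hybrid strategy described above. It is an illustration under stated assumptions, not the authors' implementation: the noise covariance is taken to be the identity, the names and values of the hyperparameters (`vartheta`, `beta1`, `beta2`, `r`) are assumed, and the theta-update for the generalized gamma hyperprior is solved with a generic bracketing root-finder rather than any scheme from the paper.

```python
import numpy as np
from scipy.optimize import brentq

def ias(A, b, vartheta, theta_update, theta0=None, n_iter=100, tol=1e-8):
    """Alternate the quadratic x-update with a componentwise theta-update
    until the iterates stagnate (IAS-type scheme)."""
    m, n = A.shape
    theta = np.full(n, vartheta) if theta0 is None else theta0.copy()
    x = np.zeros(n)
    for _ in range(n_iter):
        # x-update: minimize 0.5*||A x - b||^2 + 0.5*sum(x_j^2 / theta_j),
        # posed as ordinary least squares on the stacked system
        # [A; diag(theta)^(-1/2)] x ~ [b; 0].
        A_ext = np.vstack([A, np.diag(1.0 / np.sqrt(theta))])
        b_ext = np.concatenate([b, np.zeros(n)])
        x_new = np.linalg.lstsq(A_ext, b_ext, rcond=None)[0]
        theta = theta_update(x_new)
        if np.linalg.norm(x_new - x) <= tol * (np.linalg.norm(x_new) + 1e-30):
            return x_new, theta
        x = x_new
    return x, theta

def theta_gamma(x, vartheta, beta):
    """Closed-form theta-update for the gamma hyperprior (r = 1), with
    eta = beta - 3/2 > 0; the Gibbs energy is globally convex in this case."""
    eta = beta - 1.5
    return vartheta * (eta / 2 + np.sqrt(eta**2 / 4 + x**2 / (2 * vartheta)))

def theta_gen_gamma(x, vartheta, r, beta):
    """theta-update sketch for a generalized gamma hyperprior with 0 < r < 1:
    the stationarity condition r*t*(t/vartheta)**r - eta*t - x_j**2/2 = 0
    has no closed form, so each component is found by bracketing; the
    bracket endpoints below are illustrative."""
    eta = r * beta - 1.5  # assumed positive
    g = lambda t, xj: r * t * (t / vartheta) ** r - eta * t - xj ** 2 / 2
    return np.array([brentq(g, 1e-14, 1e8, args=(xj,)) for xj in x])

def hybrid_ias(A, b, vartheta=1e-3, beta1=2.0, r=0.5, beta2=4.0):
    """Two-stage hybrid: (1) run IAS with the gamma hyperprior, whose global
    convergence brings the iterate near the unique minimizer; (2) restart
    from there with a generalized gamma hyperprior that promotes sparsity
    more strongly."""
    _, theta1 = ias(A, b, vartheta, lambda x: theta_gamma(x, vartheta, beta1))
    return ias(A, b, vartheta,
               lambda x: theta_gen_gamma(x, vartheta, r, beta2), theta0=theta1)
```

The warm start passes the stage-one variances `theta1` to the second stage, so the strongly sparsity-promoting update begins in the neighborhood reached by the convex stage rather than at an arbitrary initial point.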

[1] D. Calvetti et al., Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors, Inverse Problems, 2020.

[2] B. Vantaggi et al., Bayes Meets Krylov: Statistically Inspired Preconditioners for CGLS, SIAM Rev., 2018.

[3] F. Liang et al., Statistical and Computational Inverse Problems, Technometrics, 2006.

[4] D. Calvetti et al., Introduction to Bayesian Scientific Computing: Ten Lectures on Subjective Computing, 2007.

[5] D. Oldenburg et al., 3-D inversion of gravity data, 1998.

[6] R. A. DeVore et al., Image compression through wavelet transform coding, IEEE Trans. Inf. Theory, 1992.

[7] S. P. Boyd et al., Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, Found. Trends Mach. Learn., 2011.

[8] D. Calvetti et al., A Gaussian hypermodel to recover blocky objects, 2007.

[9] B. Vantaggi et al., Priorconditioned CGLS-Based Quasi-MAP Estimate, Statistical Stopping Rule, and Ranking of Priors, SIAM J. Sci. Comput., 2017.

[10] G. Sapiro et al., Online dictionary learning for sparse coding, ICML, 2009.

[11] B. D. Rao et al., Sparse signal reconstruction from limited data using FOCUSS: a re-weighted minimum norm algorithm, IEEE Trans. Signal Process., 1997.

[12] D. Calvetti et al., Inverse problems: From regularization to Bayesian inference, 2018.

[13] B. Vantaggi et al., A hierarchical Krylov–Bayes iterative inverse solver for MEG with physiological preconditioning, 2015.

[14] J. F. Murray et al., Dictionary Learning Algorithms for Sparse Representation, Neural Computation, 2003.

[15] D. Calvetti et al., Hierarchical Bayesian models and sparsity: ℓ2-magic, Inverse Problems, 2019.

[16] H. Hakula et al., Conditionally Gaussian Hypermodels for Cerebral Source Localization, SIAM J. Imaging Sci., 2008.

[17] E. J. Candès et al., Decoding by linear programming, IEEE Trans. Inf. Theory, 2005.

[18] E. Candès et al., Stable signal recovery from incomplete and inaccurate measurements, arXiv:math/0503066, 2005.

[19] D. W. Oldenburg et al., 3-D inversion of magnetic data, 1996.

[20] S. P. Ahlfors et al., Assessing and improving the spatial accuracy in MEG source localization by depth-weighted minimum-norm estimates, NeuroImage, 2006.

[21] I. Daubechies et al., Iteratively reweighted least squares minimization for sparse recovery, arXiv:0807.0575, 2008.

[22] M. Elad et al., From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images, SIAM Rev., 2009.

[23] M. Elad et al., Stable recovery of sparse overcomplete representations in the presence of noise, IEEE Trans. Inf. Theory, 2006.

[24] M. Vatsa et al., Deep Dictionary Learning, IEEE Access, 2016.

[25] D. Donoho, For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution, 2006.