Variational free energies for compressed sensing

We consider a variational free-energy approach to compressed sensing. We first show that the naïve mean-field approach performs remarkably well when coupled with a noise-learning procedure, and we note that it leads to the same equations as those used for iterative thresholding. We then discuss the Bethe free energy and how its stationary points correspond to the fixed points of the approximate message-passing algorithm. In both cases, we numerically test the direct optimization of the free energies as a convergent sparse-estimation algorithm. We further derive the Bethe free energy in the context of generalized approximate message passing.
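As a point of reference for the connection the abstract draws between mean-field equations and iterative thresholding, the following is a minimal sketch of the standard iterative soft-thresholding algorithm (ISTA) for sparse estimation, not the paper's own mean-field procedure; all names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding operator: shrinks entries toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam=0.01, n_iter=2000):
    """Iterative soft-thresholding for the LASSO objective
    min_x 0.5 * ||y - A x||^2 + lam * ||x||_1  (illustrative sketch)."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # Gradient step on the quadratic term, then shrinkage on the l1 term.
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

# Toy compressed-sensing instance: recover a 5-sparse signal from 50 measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
x0 = np.zeros(100)
x0[:5] = [1.0, -2.0, 3.0, -1.0, 2.0]
y = A @ x0
x_hat = ista(A, y)
```

The update has the "gradient step followed by thresholding" structure that the abstract identifies with the naïve mean-field fixed-point equations.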
