Learning Multiple Belief Propagation Fixed Points for Real Time Inference

In the context of inference with expectation constraints, we propose an approach based on the "loopy belief propagation" algorithm (lbp) as a surrogate for exact Markov random field (mrf) modelling. Prior information, consisting of correlations among a large set of N variables, is encoded into a graphical model; this encoding is optimized with respect to the approximate decoding procedure (lbp), which is used to infer hidden variables from an observed subset. We focus on the situation where the underlying data have many distinct statistical components, representing a variety of independent patterns. Considering a single-parameter family of models, we show how lbp may be used to encode and decode such information efficiently, without solving the NP-hard inverse problem that yields the optimal mrf. Contrary to usual practice, we work in the non-convex Bethe free energy minimization framework, and manage to associate a belief propagation fixed point with each component of the underlying probabilistic mixture. The mean-field limit is considered and yields an exact connection with the Hopfield model at finite temperature and steady state, when the number of mixture components is proportional to the number of variables. In addition, we provide an enhanced learning procedure, based on a straightforward multi-parameter extension of the model in conjunction with an effective continuous optimization procedure. This is carried out using the stochastic search heuristic cmaes and yields a significant improvement over the single-parameter basic model.
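The decoding step relies on running loopy belief propagation to a fixed point on a pairwise mrf. As a minimal sketch of that machinery (not the paper's learned model — the graph, couplings, and fields below are illustrative choices), the following runs sum-product message passing on a binary 3-cycle until the messages settle, then compares the resulting beliefs with the exact marginals obtained by brute-force enumeration:

```python
import itertools
import numpy as np

# Illustrative pairwise binary MRF on a 3-cycle (the smallest loopy graph).
spins = np.array([-1.0, 1.0])                 # Ising variables x_i in {-1,+1}
J = {(0, 1): 0.5, (1, 2): 0.5, (0, 2): 0.5}   # pairwise couplings (assumed values)
h = np.array([0.2, -0.1, 0.3])                # local fields (assumed values)
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

def psi_node(i):
    # Local factor psi_i(x_i) = exp(h_i * x_i)
    return np.exp(h[i] * spins)

def psi_pair(i, j):
    # Pairwise factor psi_ij(x_i, x_j) = exp(J_ij * x_i * x_j), a 2x2 matrix
    Jij = J[(min(i, j), max(i, j))]
    return np.exp(Jij * np.outer(spins, spins))

# One message per directed edge: m_{i->j}(x_j), initialised uniform.
edges = [(i, j) for i in nbrs for j in nbrs[i]]
msg = {e: np.ones(2) / 2 for e in edges}

# Sum-product updates, iterated long enough to reach a fixed point
# (couplings here are weak, so plain parallel updates converge).
for _ in range(200):
    new = {}
    for (i, j) in edges:
        prod_in = psi_node(i).copy()
        for k in nbrs[i]:
            if k != j:
                prod_in = prod_in * msg[(k, i)]   # incoming messages except from j
        m = psi_pair(i, j).T @ prod_in            # sum over x_i
        new[(i, j)] = m / m.sum()                 # normalise for stability
    msg = new

# Beliefs: b_i(x_i) proportional to psi_i(x_i) * prod of incoming messages.
beliefs = []
for i in range(3):
    b = psi_node(i) * np.prod([msg[(k, i)] for k in nbrs[i]], axis=0)
    beliefs.append(b / b.sum())

def exact_marginal(i):
    # Exact marginal by summing the joint over all 2^3 configurations.
    Z, p = 0.0, np.zeros(2)
    for xs in itertools.product([0, 1], repeat=3):
        w = np.prod([psi_node(n)[xs[n]] for n in range(3)])
        w *= np.prod([psi_pair(a, b)[xs[a], xs[b]] for (a, b) in J])
        Z += w
        p[xs[i]] += w
    return p / Z
```

On a loopy graph the Bethe beliefs are only approximations of the true marginals, but for weak couplings like these they land close; with strong couplings or frustration, distinct fixed points can appear, which is precisely the multiplicity the abstract exploits to represent mixture components.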
