Tightness Results for Local Consistency Relaxations in Continuous MRFs

Finding the MAP assignment in graphical models is a challenging task that generally requires approximations. One popular approximation approach is to use linear programming relaxations that enforce local consistency. While these are commonly used for models with discrete variables, they are much less understood for models with continuous variables. Here we define local consistency relaxations of MAP for continuous pairwise Markov Random Fields (MRFs) and analyze their properties. We begin by characterizing the models for which this relaxation is tight; these turn out to be models that can be reparameterized as a sum of local convex functions. We also provide a simple formulation of this relaxation for Gaussian MRFs. Next, we show how these insights can be used to obtain optimality certificates for loopy belief propagation (LBP) in such models. Specifically, we show that the messages of LBP can be used to calculate upper and lower bounds on the MAP value, and that these bounds coincide at convergence, yielding a natural stopping criterion that was not previously available. Finally, our results illustrate a close connection between local consistency relaxations of MAP and LBP: in the continuous case, whenever LBP is provably optimal, so is the local consistency relaxation.
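
For concreteness, the following is a minimal sketch of the MAP problem and its local consistency relaxation for a pairwise MRF, written in standard notation (node and edge potentials \theta_i, \theta_{ij}, pseudomarginal densities \mu_i, \mu_{ij}); it illustrates the general construction referred to in the abstract rather than reproducing the paper's exact formulation.

% MAP estimation in a pairwise MRF over a graph G = (V, E):
\[
\mathrm{MAP} \;=\; \max_{x}\; \sum_{i \in V} \theta_i(x_i) \;+\; \sum_{(i,j) \in E} \theta_{ij}(x_i, x_j).
\]

% Local consistency relaxation: maximize over pseudomarginals that are
% nonnegative, normalized, and pairwise consistent. With continuous
% variables, the pseudomarginals are densities and consistency is an
% integral constraint rather than a sum.
\[
\begin{aligned}
\max_{\mu \ge 0} \quad
  & \sum_{i \in V} \int \theta_i(x_i)\,\mu_i(x_i)\,dx_i
    \;+\; \sum_{(i,j) \in E} \iint \theta_{ij}(x_i,x_j)\,\mu_{ij}(x_i,x_j)\,dx_i\,dx_j \\
\text{s.t.} \quad
  & \int \mu_{ij}(x_i,x_j)\,dx_j = \mu_i(x_i), \qquad
    \int \mu_{ij}(x_i,x_j)\,dx_i = \mu_j(x_j)
    \qquad \forall (i,j) \in E, \\
  & \int \mu_i(x_i)\,dx_i = 1 \qquad \forall i \in V.
\end{aligned}
\]

% Bounds of the kind the abstract attributes to LBP messages: for any
% reparameterization \bar\theta of the model (local terms that sum to
% the original objective), independent local maximization gives an
% upper bound, while any decoded assignment \hat{x} gives a lower bound:
\[
\sum_{i \in V} \theta_i(\hat{x}_i) + \sum_{(i,j) \in E} \theta_{ij}(\hat{x}_i,\hat{x}_j)
\;\le\; \mathrm{MAP} \;\le\;
\sum_{i \in V} \max_{x_i} \bar{\theta}_i(x_i)
  + \sum_{(i,j) \in E} \max_{x_i,x_j} \bar{\theta}_{ij}(x_i,x_j).
\]

On this reading, the stopping criterion described in the abstract amounts to forming a reparameterization from the LBP messages and halting when the gap between the two bounds closes; the precise construction of \bar\theta from the messages is the paper's contribution and is not reproduced here.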
