Local entropy as a measure for sampling solutions in Constraint Satisfaction Problems

We introduce a novel Entropy-driven Monte Carlo (EdMC) strategy to efficiently sample solutions of random Constraint Satisfaction Problems (CSPs). First, we extend a recent result showing, via a large-deviation analysis, that the space of solutions of the Binary Perceptron Learning Problem (a prototypical CSP) contains regions of very high solution density. Despite being sub-dominant, these regions can be found by optimizing a local entropy measure. Building on these results, we construct a fast solver that relies exclusively on a local entropy estimate and can be applied to general CSPs. We describe its performance not only on the Perceptron Learning Problem but also on the random $K$-Satisfiability Problem (another prototypical CSP with a radically different structure), and show numerically that a simple zero-temperature Metropolis search in the smooth local entropy landscape can reach sub-dominant clusters of optimal solutions in a small number of steps, whereas standard Simulated Annealing either requires extremely long cooling schedules or simply fails. We also discuss how EdMC can heuristically be made even more efficient in the cases we studied.
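To make the EdMC idea concrete, here is a minimal sketch of a zero-temperature Metropolis search on a local entropy surrogate for a toy random $K$-SAT instance. It is an illustration only, not the paper's implementation: the function names (`edmc_search`, `local_entropy_estimate`) and parameters (radius `d`, inverse temperature `beta`, sample count `m`) are hypothetical, and the local entropy is approximated by brute-force sampling of a smoothed (finite-`beta`) solution count, whereas the paper estimates it analytically (e.g. with Belief Propagation).

```python
import math
import random

# Toy random K-SAT generator: each clause is a list of signed 1-based
# variable indices (positive literal = variable True, negative = False).
def random_ksat(n_vars, n_clauses, k=3, rng=random):
    clauses = []
    for _ in range(n_clauses):
        chosen = rng.sample(range(1, n_vars + 1), k)
        clauses.append([v if rng.random() < 0.5 else -v for v in chosen])
    return clauses

def num_violated(assign, clauses):
    """Number of unsatisfied clauses; assign[i] is +1/-1 for variable i+1."""
    return sum(
        not any((assign[abs(lit) - 1] > 0) == (lit > 0) for lit in clause)
        for clause in clauses
    )

def local_entropy_estimate(assign, clauses, d, beta=1.0, m=100, rng=random):
    """Sampling surrogate for the local entropy around `assign`: the log of
    the average of exp(-beta * violations) over random configurations within
    Hamming distance d.  As beta -> infinity this tends to the log-fraction
    of nearby solutions; a finite beta keeps the signal smooth enough for a
    toy demo.  (The paper estimates the local entropy analytically rather
    than by brute-force sampling.)"""
    n = len(assign)
    total = 0.0
    for _ in range(m):
        neighbor = list(assign)
        for i in rng.sample(range(n), rng.randint(0, d)):
            neighbor[i] = -neighbor[i]
        total += math.exp(-beta * num_violated(neighbor, clauses))
    return math.log(total / m)

def edmc_search(clauses, n_vars, d=3, steps=1000, rng=random):
    """Zero-temperature Metropolis walk on the local entropy surrogate:
    accept a single-spin flip only if the estimate does not decrease."""
    assign = [rng.choice([-1, 1]) for _ in range(n_vars)]
    score = local_entropy_estimate(assign, clauses, d, rng=rng)
    for _ in range(steps):
        i = rng.randrange(n_vars)
        assign[i] = -assign[i]                       # propose a flip
        new_score = local_entropy_estimate(assign, clauses, d, rng=rng)
        if new_score >= score:
            score = new_score                        # accept the move
        else:
            assign[i] = -assign[i]                   # reject: undo the flip
        if num_violated(assign, clauses) == 0:
            break                                    # reached a solution
    return assign

if __name__ == "__main__":
    random.seed(0)
    formula = random_ksat(n_vars=30, n_clauses=90)   # alpha = 3 (easy regime)
    result = edmc_search(formula, n_vars=30)
    print("violated clauses at the end:", num_violated(result, formula))
```

The key design point the sketch tries to convey is that the move acceptance depends only on the local entropy estimate, never directly on the energy (number of violated clauses) of the current configuration: the smoothness of the local entropy landscape is what lets a plain zero-temperature search make progress where Simulated Annealing on the energy would get trapped.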
