Inference and learning in sparse systems with multiple states.

We discuss how inference can be performed when data are sampled from the nonergodic phase of systems with multiple attractors. Taking the finite-connectivity Hopfield model in the memory phase as a model system, we propose a cavity-method approach to reconstruct the couplings when the data are sampled separately from a few attractor states. We also show how the inference results can be turned into a learning protocol for neural networks in which patterns are presented through weak external fields. The protocol is simple and fully local, and it stores patterns with a finite overlap with the input patterns without ever entering a spin-glass phase in which all memories are lost.
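To make the flavor of such a protocol concrete, here is a minimal Python sketch: patterns are presented to a sparse network as weak external fields, the network is equilibrated with Glauber dynamics, and the couplings are updated by a purely local Hebbian-style rule from the sampled configurations. The graph construction, the field strength eps, the learning rate eta, and the update rule itself are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

# Illustrative sketch: local learning in a sparse network where patterns
# are presented through weak external fields (assumed rule, not the
# paper's exact protocol).

rng = np.random.default_rng(0)

N = 100          # number of spins
c = 4            # mean connectivity of the sparse graph
P = 3            # number of patterns to store
beta = 2.0       # inverse temperature
eps = 0.3        # weak external-field strength (assumed value)
eta = 0.1        # learning rate (assumed value)

# Random sparse symmetric adjacency matrix with mean degree ~ c
A = (rng.random((N, N)) < c / N).astype(float)
A = np.triu(A, 1)
A = A + A.T

J = np.zeros((N, N))                        # couplings to be learned
patterns = rng.choice([-1, 1], size=(P, N))

def glauber_sweep(s, J, h, beta):
    """One sweep of Glauber dynamics at inverse temperature beta."""
    for i in rng.permutation(len(s)):
        field = J[i] @ s + h[i]
        s[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2 * beta * field)) else -1
    return s

for epoch in range(20):
    for xi in patterns:
        h = eps * xi                        # present the pattern as a weak field
        s = rng.choice([-1, 1], size=N)
        for _ in range(50):                 # equilibrate in the biased state
            s = glauber_sweep(s, J, h, beta)
        # Local Hebbian-style update from the sampled configuration,
        # restricted to the existing edges of the sparse graph.
        J += eta * A * np.outer(s, s) / c
        np.fill_diagonal(J, 0.0)

# Retrieval check: relax from a noisy pattern with the field switched off
# and measure the overlap with the stored pattern.
for xi in patterns:
    s = xi * rng.choice([1, -1], size=N, p=[0.9, 0.1])
    for _ in range(50):
        s = glauber_sweep(s, J, np.zeros(N), beta)
    print("overlap:", abs(s @ xi) / N)
```

The update uses only quantities available at each edge (the states of the two endpoint spins), which is what "fully local" means here; the weak field biases sampling toward the attractor of the presented pattern rather than clamping the spins outright.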
