Sparse Hopfield network reconstruction with ℓ1 regularization

Abstract We propose an efficient strategy to infer sparse Hopfield networks from magnetizations and pairwise correlations measured through Glauber sampling. The strategy incorporates ℓ1 regularization into the Bethe approximation via a quadratic approximation to the log-likelihood, and further reduces the inference error below that of the unregularized Bethe approximation. The optimal regularization parameter is observed to be of the order of M^{-ν}, where M is the number of independent samples. The value of the scaling exponent depends on the performance measure: ν ≃ 0.5001 for the root-mean-squared error and ν ≃ 0.2743 for the misclassification rate. The efficiency of this strategy is demonstrated on the sparse Hopfield model, but the method is generally applicable to other diluted mean-field models. In particular, it is simple to implement and carries no heavy computational cost.
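The abstract describes minimizing a quadratic approximation of the (negative) log-likelihood plus an ℓ1 penalty on the couplings. The paper's own Bethe-based update is not reproduced here, but the generic structure of such quadratic-plus-ℓ1 problems can be illustrated with iterative soft-thresholding (ISTA), whose proximal step is what drives estimated couplings exactly to zero. A minimal NumPy sketch on a toy least-squares problem (all names and parameters below are illustrative, not from the paper):

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t,
    # setting entries with |x| <= t exactly to zero (this creates sparsity).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, n_iter=2000):
    # Iterative soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Step size 1/L, with L = ||A||_2^2 the Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)          # gradient of the quadratic part
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

As in the paper's setting, the penalty strength `lam` trades reconstruction error against sparsity: larger values zero out more (spurious) couplings but bias the surviving ones toward zero, which is why the optimal value shrinks with the sample size M.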
