Power law decay of stored pattern stability in sparse Hopfield neural networks

Hopfield neural networks built on scale-free networks exhibit a power-law relation between the stability of stored patterns and the number of patterns. Stability is measured by the overlap between the output state of the network and the stored pattern presented to it. In simulations, the overlap declines toward a constant following a power-law decay. Here we explain this power-law behavior through a signal-to-noise ratio analysis. We show that on sparse networks storing a large number of patterns, the stability of stored patterns can be approximated by a power-law function with exponent −0.5. The analytic result differs from the simulations in that the analytic overlap decays to 0; this discrepancy arises because the signal and noise terms of individual nodes deviate from the mean-field approximation in sparse networks of finite size.
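
A sketch of the signal-to-noise argument behind the exponent −0.5 may help. The display below assumes the standard Hebbian couplings J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ restricted to the links of the graph; it is an illustration of the generic argument, not a reproduction of the paper's derivation. When pattern ξ^1 is presented, the local field of a node i with degree k_i splits into a signal and a crosstalk term:

```latex
\[
  h_i \;=\; \underbrace{\frac{k_i}{N}\,\xi_i^{1}}_{\text{signal}}
        \;+\; \underbrace{\frac{1}{N}\sum_{\mu>1}\,\sum_{j\in\partial i}
              \xi_i^{\mu}\xi_j^{\mu}\xi_j^{1}}_{\text{noise, std }\sqrt{k_i(p-1)}/N},
  \qquad
  m \;\approx\; \operatorname{erf}\!\left(\sqrt{\frac{k_i}{2(p-1)}}\right)
    \;\sim\; \sqrt{\frac{2k_i}{\pi p}} \;\propto\; p^{-1/2}
  \quad (p \gg k_i).
\]
```

For a large number of stored patterns p the signal-to-noise ratio scales as sqrt(k_i/p), and expanding the error function for a small argument yields the power-law overlap with exponent −0.5 quoted above.

The overlap measurement itself is straightforward to reproduce. Below is a minimal sketch, not the authors' code: it builds a sparse Erdős–Rényi graph (the paper studies scale-free networks; the topology, sizes, and single-step update here are illustrative assumptions), stores p random patterns with the Hebbian rule, presents one pattern, and reports the overlap.

```python
# Minimal sketch (illustrative, not the authors' code): overlap of a stored
# pattern after one synchronous update in a sparse Hopfield network.
import numpy as np

rng = np.random.default_rng(0)
N, k_mean = 1000, 10                      # assumed network size and mean degree

# Sparse symmetric adjacency matrix without self-loops.
A = np.triu(rng.random((N, N)) < k_mean / N, k=1).astype(float)
A = A + A.T

def overlap(p):
    """Store p random +/-1 patterns, present pattern 0, and return the
    overlap m = (1/N) * sum_i xi_i^0 * S_i after one synchronous update."""
    xi = rng.choice([-1, 1], size=(p, N))
    J = A * (xi.T @ xi) / N               # Hebbian couplings on the graph
    S = np.where(J @ xi[0] >= 0, 1, -1)   # one synchronous sign update
    return float(np.mean(xi[0] * S))

for p in (10, 100, 1000):
    print(f"p = {p:5d}  m = {overlap(p):.3f}")
```

Plotting m against p on a log-log scale for such a run gives the kind of curve the abstract describes: an initial power-law decline that levels off at a nonzero constant in finite sparse networks.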
