On learning mean values in Hopfield associative memories trained with noisy examples using the Hebb rule

Using standard results from probability theory, we study the ability of the Hopfield model of associative memory, trained with the Hebb rule, to learn mean values from examples in the presence of noise. We state and prove properties concerning this ability.
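
As a rough illustration of the setting (a minimal sketch, not the paper's own construction or proofs), the snippet below trains a Hopfield weight matrix with the Hebb rule on noisy ±1 examples of a single pattern. Since each noisy bit has mean (1 − 2p)·ξ_i when bits are flipped independently with probability p, averaging many examples lets the Hebbian weights approximate the outer product of the underlying pattern, which is the "mean value" learning being studied. The network size, noise level, and number of examples are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # number of neurons (arbitrary choice for illustration)
M = 500          # number of noisy examples presented
flip_prob = 0.2  # probability p that each bit of an example is flipped

# Underlying +/-1 pattern whose "mean" the network should learn.
xi = rng.choice([-1, 1], size=N)

# Noisy examples: each bit flipped independently with probability flip_prob.
flips = rng.random((M, N)) < flip_prob
examples = np.where(flips, -xi, xi)

# Hebb rule: average of outer products of the noisy examples, zero diagonal.
W = examples.T @ examples / M
np.fill_diagonal(W, 0.0)

# Synchronous retrieval from a corrupted probe of the original pattern.
probe = np.where(rng.random(N) < 0.3, -xi, xi)
state = probe.copy()
for _ in range(10):
    state = np.sign(W @ state)
    state[state == 0] = 1  # break ties toward +1

overlap = np.mean(state == xi)
print(f"fraction of bits matching the underlying pattern: {overlap:.2f}")
```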
