A biologically constrained learning mechanism in networks of formal neurons

A new learning mechanism is proposed for networks of formal neurons analogous to Ising spin systems. It brings such models substantially closer to biological data in three respects: first, the learning procedure is applied to a network that starts with random connections (and may thus resemble a spin glass), rather than to a system void of any knowledge as in the Hopfield model; second, the resulting couplings are not symmetric; third, patterns can be stored without changing the sign of any coupling coefficient. It is shown that the storage capacity of such networks is similar to that of the Hopfield network, and that it is not significantly reduced by the restriction that the couplings keep their signs throughout the learning phase. Although this approach does not claim to model the central nervous system, it provides new insight into a frontier area shared by statistical physics, artificial intelligence, and neurobiology.
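
The abstract describes the mechanism only qualitatively. As an illustration, the sketch below shows one way the three constraints (random initial couplings, asymmetry, fixed coupling signs) could be realized, using a perceptron-style storage rule with sign clipping. The update rule, learning rate, and network size are assumptions for illustration, not the paper's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                       # neurons, stored patterns (low load P/N)
patterns = rng.choice([-1, 1], size=(P, N))

# Start from random, asymmetric couplings (spin-glass-like), not from zero.
J = rng.normal(size=(N, N))
np.fill_diagonal(J, 0.0)
sign0 = np.sign(J)                  # signs to be preserved during learning

def clip_to_sign(J, sign0):
    """Zero any coupling whose sign would flip; keep no self-coupling."""
    J[np.sign(J) != sign0] = 0.0
    np.fill_diagonal(J, 0.0)
    return J

# Hypothetical perceptron-style storage: strengthen each neuron's couplings
# toward a pattern until its local field agrees with its target state,
# clipping after every step so no coupling ever changes sign.
eta = 0.1
for epoch in range(500):
    stable = True
    for xi in patterns:
        h = J @ xi                  # local fields in state xi
        wrong = (xi * h) <= 0       # neurons not yet aligned with the pattern
        if wrong.any():
            stable = False
            J[wrong] += eta * np.outer(xi[wrong], xi)
            J = clip_to_sign(J, sign0)
    if stable:
        break

# Each stored pattern should now be a fixed point of the dynamics sign(J @ s).
recalled = np.sign(J @ patterns.T).T
```

At this low load the sign constraint costs little: couplings whose Hebbian-like update would flip their sign are simply held at zero, which is consistent with the abstract's claim that capacity is not significantly reduced.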
