Constraints on learning in dynamic synapses
Some constraints intrinsic to unsupervised learning in attractor neural networks (ANNs) are discussed. Hebbian-type learning is analyzed in a network whose synapses are analog dynamic variables with a fixed, finite number of states that are stable on long time scales. It is shown that if the patterns to be learned are random words of ±1 bits, then in the limit of slow presentation the network can learn at most O(ln N) patterns in N neurons. Going beyond this logarithmic constraint requires stochastic learning of the patterns and a low coding rate.
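Since the abstract only states the capacity result, a minimal simulation sketch may help make the mechanism concrete. The code below is not from the paper: it assumes a Hopfield-style network, Hebbian increments clipped to a finite set of synaptic states, and a stochastic acceptance probability Q per synapse; the parameter names and values (N, N_STATES, Q) are illustrative assumptions. Clipping to bounded states makes newly stored patterns gradually overwrite older ones (a palimpsest effect), which is the mechanism behind a logarithmic capacity; stochastic acceptance of updates slows this overwriting, in the spirit of the result described.

```python
# Minimal sketch (not the paper's model): Hebbian learning with bounded,
# discrete synapses in a Hopfield-style network. All parameters are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

N = 200          # neurons
N_STATES = 7     # odd number of discrete synaptic states: {-3, ..., +3}
S_MAX = N_STATES // 2
Q = 0.1          # probability a synapse accepts an update (stochastic learning)

def hebbian_update(J, pattern, q=Q):
    """One slow presentation: Hebbian increments accepted with probability q,
    then clipped to the finite set of stable states {-S_MAX, ..., +S_MAX}."""
    dJ = np.outer(pattern, pattern)
    np.fill_diagonal(dJ, 0)                 # no self-connections
    mask = rng.random(J.shape) < q          # stochastic acceptance
    return np.clip(J + mask * dJ, -S_MAX, S_MAX)

def recall_quality(J, pattern, steps=20):
    """Iterate sign dynamics from the stored pattern; return the fraction
    of bits that still match after relaxation."""
    s = pattern.copy()
    for _ in range(steps):
        s = np.sign(J @ s)
        s[s == 0] = 1
    return np.mean(s == pattern)

patterns = rng.choice([-1, 1], size=(50, N))
J = np.zeros((N, N))
for p in patterns:
    J = hebbian_update(J, p)

# Recently stored patterns remain retrievable; earlier ones have been
# overwritten by the clipping, consistent with a bounded (logarithmic) capacity.
for k in [0, 25, 49]:
    print(f"pattern {k:2d}: fraction of correct bits = {recall_quality(J, patterns[k]):.2f}")
```

Varying Q and N_STATES in this sketch shows the trade-off the abstract points to: deterministic updates (Q = 1) erase old memories fastest, while small Q trades learning speed for retention.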