Learning Constraints in Storage Capacity in Networks with Dynamic Synapses
Some constraints intrinsic to unsupervised learning in attractor neural networks (ANNs) are discussed. We present a very simple, realizable model of an ANN capable of dynamically learning and classifying input stimuli in a totally unsupervised fashion. The synapses of the network are analog dynamic variables whose values have to be periodically refreshed to avoid memory loss. Two refreshing mechanisms are discussed: the first is a periodic deterministic refresh, while the second acts stochastically. Some typical learning scenarios are then described and constraints on storage capacity are exposed: in the worst case a network of N neurons can learn at most O(ln N) patterns, while in the best case (stochastic learning) the number of stored patterns cannot surpass O(√N). We encountered these constraints in connection with the design of a learning ANN implemented in silicon.
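The contrast between the two scaling laws can be made concrete with a small simulation. The sketch below is not the paper's model; it is a minimal illustration under stated assumptions (binary ±1 synapses, a hypothetical transition probability q standing in for the stochastic refresh, a noise floor of 1.5/√N on the overlap readout) of why slow stochastic updating stretches the memory span: the trace of a pattern imprinted with probability q decays as q(1−q)^t after t further patterns, and choosing q on the order of 1/√N keeps it above the ~1/√N readout noise for on the order of √N subsequent patterns.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not the paper's model):
# N binary synapses, each overwritten by the current input pattern
# with probability q. The overlap of the weights with a pattern stored
# t patterns ago decays as q*(1-q)**t; readout noise is ~1/sqrt(N).
rng = np.random.default_rng(0)

def memory_span(N, q, trials=50):
    """Mean number of later patterns a stored pattern survives."""
    spans = []
    for _ in range(trials):
        J = rng.choice([-1.0, 1.0], size=N)       # random initial synapses
        target = rng.choice([-1.0, 1.0], size=N)  # pattern we track
        mask = rng.random(N) < q                  # stochastic imprint
        J[mask] = target[mask]
        t = 0
        while J @ target / N > 1.5 / np.sqrt(N):  # trace above noise floor
            new = rng.choice([-1.0, 1.0], size=N) # next random pattern
            mask = rng.random(N) < q              # overwrites with prob. q
            J[mask] = new[mask]
            t += 1
        spans.append(t)
    return float(np.mean(spans))

for N in (1024, 4096, 16384):
    q = 3.0 / np.sqrt(N)  # assumed near-optimal transition probability
    print(f"N={N:6d}  q={q:.4f}  span ≈ {memory_span(N, q):.1f}")
```

Running the script shows the measured span roughly doubling each time N quadruples, the √N signature of the stochastic regime; by contrast, a deterministic overwrite (q = 1) would erase the tracked trace after a single new pattern in this toy setting, hinting at why deterministic schemes fare much worse.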