Bistable gradient networks. I. Attractors and pattern retrieval at low loading in the thermodynamic limit.
We examine the large-network, low-loading behavior of an attractor neural network, the so-called bistable gradient network (BGN), and compare it with that of the Hopfield network (HN). We use analytical and numerical methods to characterize the attractor states of the network and their basins of attraction. The energy landscape of the BGN is more complex than that of the HN and depends on the strength of the coupling among units. At weak coupling, the BGN acts as a highly selective associative memory; the input must be close to one of the stored patterns in order to be recognized. A category of spurious attractors occurs that is not present in the HN. Stronger coupling results in a transition to a more Hopfield-like regime with large basins of attraction. The basins of attraction for spurious attractors are noticeably suppressed compared to the Hopfield case, even though the Hebbian synaptic structure is the same and there is no stochastic noise.
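To make the setup concrete, below is a minimal sketch of pattern retrieval in a BGN-style network. It assumes the standard form of the BGN energy reported in the literature, a local double-well potential per unit plus a Hebbian coupling term of strength gamma, with simple gradient-descent dynamics; the parameter values, step counts, and helper names (`energy`, `relax`) are illustrative rather than taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100      # number of units
P = 3        # number of stored patterns (low loading)
gamma = 0.5  # coupling strength (illustrative value)

# Hebbian couplings built from random binary patterns,
# the same learning rule used by the Hopfield network.
patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def energy(x):
    # Assumed BGN energy: local double-well term plus Hebbian coupling term.
    return np.sum(-x**2 / 2 + x**4 / 4) - 0.5 * gamma * x @ W @ x

def relax(x, dt=0.05, steps=5000):
    # Deterministic gradient descent on the energy; units are continuous.
    for _ in range(steps):
        grad = (-x + x**3) - gamma * (W @ x)
        x = x - dt * grad
    return x

# Retrieval test: start from a cue near a stored pattern with a few flipped units.
cue = patterns[0].copy()
flip = rng.choice(N, size=5, replace=False)
cue[flip] *= -1

x_final = relax(cue.astype(float))
overlap = np.sign(x_final) @ patterns[0] / N
print(f"overlap with stored pattern: {overlap:.2f}, energy: {energy(x_final):.2f}")
```

In this sketch, small gamma lets the local double-well term dominate, so retrieval succeeds only for cues close to a stored pattern, while larger gamma shifts the balance toward the Hopfield-like coupling term, consistent with the transition between regimes described in the abstract.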