Building a hierarchy with neural networks: an example-image vector quantization.

Electronic neural networks can perform the function of associative memory. Given an input pattern, the network searches through its stored memories to find the one that best matches the input. Thus the network performs a combination of content-addressable search and error correction. The number of random memories that a network can store is limited to a fraction of the number of electronic neurons in the circuit. We propose a method for building a hierarchy of networks that allows a fast parallel search through a list of memories too large to store in a single network. We demonstrate the principle of this approach with an example in image vector quantization.
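
The hierarchical search described above can be illustrated with a minimal sketch, not the authors' circuit or implementation: a two-level hierarchy for image vector quantization in which an input block is first matched against a small set of coarse prototypes, and only the sub-codebook under the winning prototype is then searched, so no single stage needs to hold the full list of codewords. All names and sizes here (`coarse_codebook`, `sub_codebooks`, `quantize`, the codebook dimensions) are illustrative assumptions.

```python
import numpy as np

# Hypothetical two-level hierarchical vector quantizer.
# Stage 1: a small "network" of coarse prototypes selects a branch.
# Stage 2: only that branch's sub-codebook is searched for the best match.
# This mirrors splitting one large memory list across a hierarchy of small
# associative networks; it is a sketch, not the paper's electronic circuit.

rng = np.random.default_rng(0)

BLOCK = 16           # dimensionality of an image block (e.g. a 4x4 patch)
N_COARSE = 8         # prototypes stored in the top-level stage
N_PER_BRANCH = 32    # codewords stored in each second-level stage

# Random codebooks stand in for trained ones.
coarse_codebook = rng.normal(size=(N_COARSE, BLOCK))
sub_codebooks = rng.normal(size=(N_COARSE, N_PER_BRANCH, BLOCK))

def nearest(codebook: np.ndarray, x: np.ndarray) -> int:
    """Index of the codeword closest to x (content-addressable lookup)."""
    return int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))

def quantize(x: np.ndarray) -> tuple[int, int]:
    """Return (branch, codeword) indices for an input block x."""
    branch = nearest(coarse_codebook, x)       # top-level search
    code = nearest(sub_codebooks[branch], x)   # search only one branch
    return branch, code

def reconstruct(branch: int, code: int) -> np.ndarray:
    """Recover the stored codeword, i.e. the error-corrected memory."""
    return sub_codebooks[branch][code]

if __name__ == "__main__":
    x = rng.normal(size=BLOCK)                 # a noisy image block
    b, c = quantize(x)
    print("branch", b, "codeword", c,
          "distortion", float(np.sum((reconstruct(b, c) - x) ** 2)))
```

Each stage searches only a small codebook, so each level could be realized by a network whose capacity is a fraction of its neuron count, while the hierarchy as a whole covers a much longer list of memories.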
