Adaptive bidirectional associative memories.

Bidirectionality, forward and backward information flow, is introduced in neural networks to produce two-way associative search for stored stimulus-response associations (A(i),B(i)). Two fields of neurons, F(A) and F(B), are connected by an n x p synaptic matrix M. Passing information through M gives one direction; passing information through its transpose M(T) gives the other. Every matrix is bidirectionally stable for bivalent and for continuous neurons. Paired data (A(i),B(i)) are encoded in M by summing bipolar correlation matrices. The bidirectional associative memory (BAM) behaves as a two-layer hierarchy of symmetrically connected neurons. When the neurons in F(A) and F(B) are activated, the network quickly evolves to a stable state of two-pattern reverberation, or pseudoadaptive resonance, for every connection topology M. The stable reverberation corresponds to a local minimum of the system energy. An adaptive BAM allows M to rapidly learn associations without supervision. Stable short-term memory reverberations across F(A) and F(B) gradually seep pattern information into the long-term memory connections M, allowing input associations (A(i),B(i)) to dig their own energy wells in the network state space. The BAM correlation encoding scheme is extended to a general Hebbian learning law. Every BAM then adaptively resonates in the sense that all nodes and edges quickly equilibrate in a local energy minimum. A sampling adaptive BAM results when many more training samples are presented than there are neurons in F(A) and F(B), but each sample is presented only for a brief pulse of learning, so that learning does not fully or even nearly converge. Learning tends to improve with sample size. Sampling adaptive BAMs can learn some simple continuous mappings and can rapidly abstract bivalent associations from several noisy gray-scale samples.
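The correlation encoding and two-way recall described above can be sketched for the bivalent case as follows. This is a minimal illustration, not the paper's implementation: pattern pairs, the tie-breaking rule (a neuron keeps its previous state on zero input), and the stopping test are assumptions chosen to make the sketch self-contained.

```python
import numpy as np

def encode(pairs):
    """Sum of bipolar correlation (outer-product) matrices: M = sum_i A_i^T B_i."""
    n, p = len(pairs[0][0]), len(pairs[0][1])
    M = np.zeros((n, p))
    for A, B in pairs:
        M += np.outer(A, B)
    return M

def threshold(x, prev):
    """Bivalent update: sign of the net input, keeping the previous state on ties."""
    return np.where(x > 0, 1, np.where(x < 0, -1, prev))

def recall(M, A):
    """Bidirectional search: pass through M, then M^T, until the A-B pair
    stops changing, i.e. a stable two-pattern reverberation is reached."""
    B = np.where(A @ M >= 0, 1, -1)          # first forward pass
    while True:
        A_new = threshold(B @ M.T, A)        # backward pass through M^T
        B_new = threshold(A_new @ M, B)      # forward pass through M
        if np.array_equal(A_new, A) and np.array_equal(B_new, B):
            return A, B                      # stable state: an energy local minimum
        A, B = A_new, B_new

# Hypothetical bipolar pattern pairs, chosen for illustration.
pairs = [
    (np.array([1, -1, 1, -1, 1, -1]), np.array([1, 1, -1, -1])),
    (np.array([1, 1, 1, -1, -1, -1]), np.array([1, -1, 1, -1])),
]
M = encode(pairs)

# Recall from a clean key, and from a key with one bit flipped.
_, B_clean = recall(M, pairs[0][0].copy())
noisy = pairs[0][0].copy()
noisy[0] = -noisy[0]
A_fixed, B_noisy = recall(M, noisy)
```

Here bidirectional stability does the error correction: the noisy key is cleaned up on the backward pass through M(T), so both recalls settle into the same stored pair.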
