This article introduces Adaptive Resonance Theory 2-A (ART 2-A), an efficient algorithm that emulates the self-organizing pattern recognition and hypothesis testing properties of the ART 2 neural network architecture, but at a speed two to three orders of magnitude faster. Analysis and simulations show how the ART 2-A systems correspond to ART 2 dynamics at both the fast-learn limit and at intermediate learning rates. Intermediate learning rates permit fast commitment of category nodes but slow recoding, analogous to properties of word frequency effects, encoding specificity effects, and episodic memory. Better noise tolerance is hereby achieved without a loss of learning stability. The ART 2 and ART 2-A systems are contrasted with the leader algorithm. The speed of ART 2-A makes practical the use of ART 2 modules in large scale neural computation.

Keywords: Neural networks, Pattern recognition, Category formation, Fast learning, Adaptive resonance.
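The "fast commitment but slow recoding" behaviour mentioned above can be illustrated with a minimal sketch of an ART 2-A-style category layer. This is a simplified assumption-laden illustration, not the paper's full ART 2-A equations: the class name, the plain dot-product choice rule, and the parameter names rho (vigilance) and beta (committed-node learning rate) are choices made here for clarity.

    import numpy as np

    # Simplified sketch (assumed form, not the paper's exact equations):
    # a new category is committed immediately when no prototype matches,
    # while an existing prototype is only nudged slowly toward the input.

    def normalize(x, eps=1e-12):
        """Scale a vector to unit Euclidean length."""
        n = np.linalg.norm(x)
        return x / (n + eps)

    class SimpleART2A:
        def __init__(self, rho=0.9, beta=0.05):
            self.rho = rho        # vigilance: minimum match required for resonance
            self.beta = beta      # small learning rate: slow recoding of committed nodes
            self.weights = []     # one unit-length prototype per committed category

        def present(self, pattern):
            """Classify one input; commit a new category or slowly recode an old one."""
            x = normalize(np.asarray(pattern, dtype=float))
            # Choice function: dot product of the input with each committed prototype.
            scores = [float(np.dot(x, w)) for w in self.weights]
            if scores and max(scores) >= self.rho:
                j = int(np.argmax(scores))
                # Resonance: slow recoding keeps the prototype stable under noise.
                self.weights[j] = normalize(self.beta * x + (1.0 - self.beta) * self.weights[j])
                return j
            # No adequate match: fast commitment of a new node, which adopts
            # the normalized input as its prototype in a single step.
            self.weights.append(x)
            return len(self.weights) - 1

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        net = SimpleART2A(rho=0.9, beta=0.05)
        base = np.array([1.0, 0.2, 0.1, 0.0])
        for _ in range(20):
            net.present(base + 0.05 * rng.standard_normal(4))  # repeatedly recodes category 0
        net.present(np.array([0.0, 0.1, 0.2, 1.0]))            # dissimilar input commits category 1
        print("categories:", len(net.weights))

Running the example forms one category that absorbs the twenty noisy variants of the first pattern and a second category for the dissimilar input, showing how a small beta yields noise tolerance without destabilizing already learned categories.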