S_ART: a modified ART 2-A algorithm with rapid intermediate learning capabilities

A modification to the ART 2-A algorithm, called S_ART, is presented, which speeds up ART 2-A in the intermediate learning case. By attaching a separate learning rate to each output node and decreasing it over time according to the amount of learning that particular node has undergone, the required training-presentation time is significantly reduced, yielding a speedup of up to two orders of magnitude. Learning-rate convergence can be tuned through a 'contribution' parameter, which sets the lower limit to which the learning rate decays. An example of the clustering characteristics of S_ART is given and compared to those of ART 2-A for rapid learning of handwritten signatures from a supplied database. Both S_ART and ART 2-A are used as the ART_a network in an ARTMAP system; the ART_b network is an ART 1 neural network.
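
The following is a minimal sketch of the per-node decaying learning rate described above, applied to an ART 2-A-style normalized prototype update. The decay schedule, the class and parameter names (e.g. `SARTNode`, `beta0`, `contribution`), and the 1/(1+n) form are illustrative assumptions, not the paper's exact formulation.

import numpy as np

class SARTNode:
    """One output node with its own learning rate that decays with use."""

    def __init__(self, pattern, beta0=0.5, contribution=0.05):
        self.w = pattern / np.linalg.norm(pattern)  # normalized prototype
        self.beta0 = beta0                # initial learning rate (assumed)
        self.contribution = contribution  # lower limit of the learning rate
        self.updates = 0                  # amount of learning this node has had

    def learning_rate(self):
        # Decrease the rate with the number of updates, but never let it
        # fall below the 'contribution' floor (assumed 1/(1+n) schedule).
        return max(self.beta0 / (1.0 + self.updates), self.contribution)

    def learn(self, pattern):
        x = pattern / np.linalg.norm(pattern)
        beta = self.learning_rate()
        # ART 2-A style convex combination of old prototype and input,
        # followed by renormalization.
        self.w = (1.0 - beta) * self.w + beta * x
        self.w /= np.linalg.norm(self.w)
        self.updates += 1

Because each node's rate decays independently, frequently updated nodes stabilize quickly while rarely updated nodes remain plastic, which is the mechanism the abstract credits for the reduced training-presentation time.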