The problem of building robust neural network classifiers that perform well across a wide variety of tasks is addressed. Evidence is presented that current procedures are accurate only on a restricted range of tasks. The authors propose multimodule architectures, based on the cooperation of two or more neural network techniques, as a solution. The idea is illustrated by the cooperation of a multilayer perceptron and a learning vector quantization algorithm. This method is shown to combine the advantages of its individual components and to be much more robust: it is accurate over a large range of problems and is easy to tune. An algorithm that allows the direct training of this multimodule architecture is described. The use of this technique enhances performance when dealing with classification problems.
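The cooperation described above can be sketched as a two-stage pipeline: a multilayer perceptron learns a hidden representation, and an LVQ classifier operates on that representation. The sketch below is illustrative only, assuming a standard LVQ1 update and plain backpropagation on a toy two-class problem; it does not reproduce the paper's joint training algorithm, and the data, network sizes, and learning rates are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: two Gaussian blobs (illustrative only).
X = np.vstack([rng.normal(-1, 0.5, (50, 2)), rng.normal(1, 0.5, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# --- Stage 1: a one-hidden-layer MLP used as a feature extractor. ---
# Trained here with plain gradient descent on squared error.
W1 = rng.normal(0, 0.5, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (4, 1)); b2 = np.zeros(1)
t = y.reshape(-1, 1).astype(float)
for _ in range(500):
    h = np.tanh(X @ W1 + b1)                    # hidden representation
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))      # sigmoid output unit
    d_out = (out - t) * out * (1 - out)         # backprop through sigmoid
    d_h = (d_out @ W2.T) * (1 - h ** 2)         # backprop through tanh
    W2 -= 0.1 * h.T @ d_out / len(X); b2 -= 0.1 * d_out.mean(0)
    W1 -= 0.1 * X.T @ d_h / len(X);   b1 -= 0.1 * d_h.mean(0)

H = np.tanh(X @ W1 + b1)                        # features fed to the LVQ stage

# --- Stage 2: LVQ1 on the MLP's hidden features. ---
protos = np.array([H[y == c].mean(0) for c in (0, 1)])  # one prototype per class
labels = np.array([0, 1])
for epoch in range(20):
    for i in rng.permutation(len(H)):
        k = np.argmin(((protos - H[i]) ** 2).sum(1))    # nearest prototype
        step = 0.05 * (H[i] - protos[k])
        # LVQ1: attract the prototype on a correct match, repel otherwise.
        protos[k] += step if labels[k] == y[i] else -step

# Classify by nearest prototype in feature space.
pred = labels[((protos[None] - H[:, None]) ** 2).sum(-1).argmin(1)]
print("training accuracy:", (pred == y).mean())
```

In this decomposition each module does what it is good at: the perceptron provides a learned, task-adapted representation, while the prototype-based LVQ stage gives a classifier that is simple to tune, which is the kind of complementarity the abstract attributes to the multimodule approach.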