A key aspect of applying neural networks is deciding how to select the network architecture and how to use the available data both to optimize performance and to allow proper evaluation. Ontogenic (evolutionary) networks address these problems by requiring minimal parameter selection and decision making from the user and by utilizing the available data more fully. We explain and illustrate this idea by comparing the classification performance of backpropagation and an ontogenic neural network. The ontogenic network is shown to have significant advantages in classification accuracy, speed, and utilization of data; furthermore, it does not require the user to determine the number of hidden units or any learning rate parameters, which avoids the trial-and-error approach necessary for both backpropagation and radial basis function networks. In previous tests, ontogenic nets matched backpropagation's results in a fraction of the training time. One of those problems also demonstrated a further advantage: by declining to respond to patterns lying outside the region on which it was trained, the network does not attempt to extrapolate and can signal that the training and test sets are not adequately compatible. Two new test problems, arising from a beam vibration minimization application in smart structures, show the network's ability to learn low-dimensional but complex classification mappings that pose great difficulty for backpropagation.
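To make the constructive idea concrete, the sketch below grows a hidden layer one unit at a time and stops when held-out error no longer improves, so the user never fixes the number of hidden units or tunes a learning rate. This is only a minimal illustration of the ontogenic principle, not the algorithm evaluated in the paper: the random-feature hidden units, the least-squares output fit, and names such as `grow_network` are assumptions introduced here to keep the example short and self-contained.

```python
# Minimal sketch of a constructive ("ontogenic") classifier: hidden units are
# added one at a time and growth stops when validation error stalls, so the
# architecture size is chosen by the data.  Illustrative only; NOT the paper's
# specific ontogenic algorithm.
import numpy as np

rng = np.random.default_rng(0)

def grow_network(X_train, y_train, X_val, y_val, max_units=50, patience=5):
    """Greedily add sigmoidal hidden units with randomly drawn input weights
    (an illustrative choice); output weights are refit by least squares after
    every addition.  Returns the best (W, b, beta) found."""
    n_features = X_train.shape[1]
    W = np.empty((0, n_features))      # hidden-unit input weights
    b = np.empty(0)                    # hidden-unit biases
    best_err, best_model, stall = np.inf, None, 0

    def hidden(X):
        return 1.0 / (1.0 + np.exp(-(X @ W.T + b)))   # logistic activations

    for _ in range(max_units):
        # Add one candidate hidden unit.
        W = np.vstack([W, rng.normal(size=n_features)])
        b = np.append(b, rng.normal())

        # Closed-form fit of the output layer: no learning rate involved.
        beta, *_ = np.linalg.lstsq(hidden(X_train), y_train, rcond=None)
        err = np.mean((hidden(X_val) @ beta - y_val) ** 2)

        if err < best_err:             # keep growing while validation improves
            best_err, best_model, stall = err, (W.copy(), b.copy(), beta), 0
        else:
            stall += 1
            if stall >= patience:      # stop: further units no longer help
                break
    return best_model

# Toy two-class problem: points inside vs. outside a circle.
X = rng.uniform(-1, 1, size=(400, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 < 0.5).astype(float)
W, b, beta = grow_network(X[:300], y[:300], X[300:], y[300:])
pred = (1.0 / (1.0 + np.exp(-(X[300:] @ W.T + b))) @ beta) > 0.5
print("hidden units grown:", W.shape[0],
      "| validation accuracy:", np.mean(pred == y[300:]))
```

The design point the sketch is meant to convey is that the held-out data, rather than the user, decides when to stop adding units, which is the sense in which ontogenic networks remove architecture selection and learning-rate tuning from the user's hands.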