OPTIMAL FEED-FORWARD NEURAL NETWORK ARCHITECTURES
One of the most important problems that neural network designers face today is choosing an appropriate network size for a given application. Network size involves, in the case of layered neural network architectures, the number of layers in a network, the number of nodes per layer, and the number of connections. Roughly speaking, a neural network implements a nonlinear mapping of the form u = G(x). The mapping function G is established during a training phase, where the network learns to correctly associate input patterns x to output patterns u. Given a set of training examples (x, u), there is probably an infinite number of different size networks that can learn to map input patterns …
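The mapping u = G(x) described above can be illustrated with a minimal sketch (this example is not from the paper; the toy target G(x) = x², the hidden-layer width of 8, and all hyperparameters are illustrative assumptions): a one-hidden-layer feed-forward network trained by gradient descent on (x, u) pairs, where the choice of hidden width is exactly the kind of network-size decision the text discusses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: learn the mapping u = G(x) = x^2 on [-1, 1].
x = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)
u = x ** 2

# Network size (here: one hidden layer of 8 tanh nodes) is the design
# choice the text discusses; 8 is an arbitrary illustrative value.
W1 = rng.normal(0.0, 0.5, (1, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(x):
    """Forward pass: hidden tanh layer, then linear output."""
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.2
_, pred0 = forward(x)
mse0 = np.mean((pred0 - u) ** 2)   # error before training

# Training phase: full-batch gradient descent on mean squared error,
# establishing the mapping function G from the (x, u) examples.
for _ in range(5000):
    h, pred = forward(x)
    err = (pred - u) / len(x)          # d(MSE)/d(pred), up to a factor of 2
    gW2 = h.T @ err; gb2 = err.sum(0)
    dh = (err @ W2.T) * (1 - h ** 2)   # backprop through tanh
    gW1 = x.T @ dh; gb1 = dh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(x)
mse = np.mean((pred - u) ** 2)     # error after training
```

A larger or smaller hidden layer would also fit this toy target, which is the point the abstract makes: many different network sizes can learn the same input-output association, and the design problem is picking an appropriate one.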