Researchers agree that exhaustive search over the space of network architectures is computationally infeasible even for networks of modest size, so heuristic strategies that dramatically reduce the search complexity are commonly used. These heuristic approaches employ directed search algorithms, such as selecting the number of nodes via sequential network construction (SNC), and pruning inputs and weights via sensitivity-based pruning (SBP) or optimal brain damage (OBD). The main disadvantage of these techniques is that they produce perceptrons with only a single hidden layer rather than a multilayer network; as a consequence, we cannot ensure that the resulting network will produce "optimal" results. This paper investigates heuristic strategies that produce multilayer perceptrons. We propose three algorithms for the construction of such networks and compare them with the classic approaches. Simulation results show that these methods lead to "near-optimal" network structures by exploiting error-related information.
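To make the growth-based heuristic concrete, the following is a minimal sketch of the general idea behind sequential network construction: add hidden nodes one at a time and stop once the validation error no longer improves by more than a tolerance. The `validation_error` function here is a hypothetical stand-in for training a network with `n_hidden` units and measuring its held-out error; it is not the paper's algorithm, only an illustration of the stopping rule.

```python
def validation_error(n_hidden):
    # Hypothetical error curve: error drops quickly at first,
    # then plateaus while model complexity keeps growing.
    return 1.0 / (n_hidden + 1) + 0.01 * n_hidden

def sequential_network_construction(max_hidden=20, tol=0.01):
    """Grow the hidden layer one node at a time (SNC-style sketch)."""
    best_n = 1
    best_err = validation_error(best_n)
    for n in range(2, max_hidden + 1):
        err = validation_error(n)
        if best_err - err < tol:
            # Improvement too small: stop growing the network.
            break
        best_n, best_err = n, err
    return best_n, best_err

n_hidden, err = sequential_network_construction()
```

With this illustrative error curve the loop settles on a small hidden layer well before `max_hidden`, which is exactly the complexity-reduction behaviour the directed search strategies aim for.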