An optimization methodology based on backpropagation neural networks is proposed. The pattern-classification capability of a backpropagation net is first employed to identify the feasible and infeasible regions (classes) of the optimization problem; the identified class boundaries enclose the multi-dimensional space within which the optimization constraints are satisfied. The same backpropagation net, adapted with different sigmoid functions, is then used to perform function mapping of the objective function and locate the optimum. Procedures for training and testing the data sets are developed and outlined for both the classification and mapping stages, and factors important to successful implementation are discussed.
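The two-stage scheme the abstract describes can be illustrated with a minimal sketch: a small backpropagation net (one sigmoid hidden layer) is first trained to classify feasible versus infeasible points, and the same architecture is then reused to map the objective function, after which the mapped objective is searched inside the predicted-feasible region. The toy problem, network sizes, and learning rates below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class BackpropNet:
    """Single-hidden-layer sigmoid net trained with plain batch backpropagation."""
    def __init__(self, n_in, n_hidden, lr=0.5):
        self.W1 = rng.normal(0.0, 1.0, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 1.0, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        self.o = sigmoid(self.h @ self.W2 + self.b2)
        return self.o

    def train(self, X, y, epochs=2000):
        for _ in range(epochs):
            o = self.forward(X)
            # Backpropagate the squared error through both sigmoid layers.
            d_o = (o - y) * o * (1.0 - o)
            d_h = (d_o @ self.W2.T) * self.h * (1.0 - self.h)
            self.W2 -= self.lr * self.h.T @ d_o / len(X)
            self.b2 -= self.lr * d_o.mean(axis=0)
            self.W1 -= self.lr * X.T @ d_h / len(X)
            self.b1 -= self.lr * d_h.mean(axis=0)

# Toy problem (an assumption for illustration):
# minimize f(x) = (x1 - 0.7)^2 + (x2 - 0.3)^2 subject to x1 + x2 <= 1.
X = rng.uniform(0.0, 1.0, (400, 2))
feasible = (X.sum(axis=1) <= 1.0).astype(float).reshape(-1, 1)

# Stage 1: classify feasible vs. infeasible regions.
clf = BackpropNet(2, 8)
clf.train(X, feasible)
acc = ((clf.forward(X) > 0.5) == (feasible > 0.5)).mean()

# Stage 2: the same architecture maps the objective, scaled into [0, 1]
# so the sigmoid output can represent it.
f = (X[:, :1] - 0.7) ** 2 + (X[:, 1:] - 0.3) ** 2
mapper = BackpropNet(2, 8, lr=1.0)
mapper.train(X, f / f.max(), epochs=4000)

# Search the predicted-feasible region for the minimum of the mapped objective.
cand = rng.uniform(0.0, 1.0, (2000, 2))
mask = clf.forward(cand).ravel() > 0.5
best = cand[mask][np.argmin(mapper.forward(cand[mask]).ravel())]
```

The key design point mirrored here is weight reuse across stages: one backpropagation architecture serves both as a constraint classifier and as an objective-function mapper, with only the training targets (class labels versus scaled objective values) changing between the two runs.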