Optimal Design of Neural Nets Using Hybrid Algorithms

Selecting the topology of a network and the correct parameters for the learning algorithm is a tedious task in designing an optimal Artificial Neural Network (ANN), that is, one that is smaller, faster, and has better generalization performance. A genetic algorithm (GA) is an adaptive search technique based on the principles and mechanisms of natural selection and survival of the fittest from natural evolution. Simulated annealing (SA) is a global optimization algorithm that can handle cost functions with quite arbitrary degrees of nonlinearity, discontinuity, and stochasticity while statistically assuring an optimal solution. In this paper we explain how a hybrid algorithm integrating the desirable aspects of GA and SA can be applied to the optimal design of an ANN. The paper is chiefly concerned with understanding current theoretical developments in Evolutionary Artificial Neural Networks (EANNs) using GAs and other heuristic procedures, and with how the proposed hybrid and other heuristic procedures can be combined to produce an optimal ANN.
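The abstract does not give the hybrid's details, so the following is only a minimal sketch of the general GA/SA combination it describes: a genetic population searches over network design parameters (here, hypothetical genes for hidden-unit count and log learning rate), while a simulated-annealing acceptance test with a cooling temperature decides whether mutated offspring replace their parents. The fitness function is a toy stand-in invented for illustration; a real design loop would train each candidate ANN and score its validation error and size.

```python
import math
import random


def fitness(genome):
    # Toy stand-in for "train the candidate ANN and measure validation
    # error plus a size penalty" (assumed, not from the paper).
    hidden_units, log_lr = genome
    error_proxy = (hidden_units - 12) ** 2 / 100 + (log_lr + 2.0) ** 2
    size_penalty = 0.01 * hidden_units  # prefer smaller networks
    return error_proxy + size_penalty   # lower is better


def mutate(genome, temperature):
    # SA influence: the continuous-gene step size shrinks as T cools.
    hidden_units, log_lr = genome
    hidden_units = max(1, hidden_units + random.choice([-1, 0, 1]))
    log_lr = log_lr + random.gauss(0, 0.1 * temperature)
    return (hidden_units, log_lr)


def crossover(a, b):
    # Uniform crossover over the two genes.
    return (a[0], b[1]) if random.random() < 0.5 else (b[0], a[1])


def hybrid_ga_sa(pop_size=20, generations=50, t0=1.0, cooling=0.95):
    random.seed(0)
    population = [(random.randint(1, 30), random.uniform(-4.0, 0.0))
                  for _ in range(pop_size)]
    temperature = t0
    for _ in range(generations):
        population.sort(key=fitness)
        parents = population[:pop_size // 2]      # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            parent = random.choice(parents)
            child = mutate(crossover(parent, random.choice(parents)),
                           temperature)
            delta = fitness(child) - fitness(parent)
            # Metropolis acceptance: always keep improvements, keep
            # worse children with probability exp(-delta / T).
            if delta < 0 or random.random() < math.exp(-delta / temperature):
                children.append(child)
            else:
                children.append(parent)
        population = parents + children
        temperature *= cooling                    # geometric cooling schedule
    return min(population, key=fitness)


best = hybrid_ga_sa()
print(best, fitness(best))
```

The SA acceptance rule is what distinguishes this from a plain GA: early on (high temperature) the search tolerates worse offspring and explores broadly; as the temperature cools it becomes increasingly greedy, which is the "desirable aspect" of SA the abstract alludes to.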
