Two-stage hybrid tuning algorithm for training neural networks in image vision applications

This paper presents a comparative study of two possible combinations of the Backpropagation (BP) algorithm and a Genetic Algorithm (GA) for neural network training. The performance of these hybrid approaches is compared with each other and with each algorithm applied separately in the training procedure. Hybrid optimisation algorithms originate from the need to tackle difficult optimisation problems by combining the advantages of their constituents: the GA offers global exploration of the weight space, while BP provides fast local refinement. The local and global search behaviour of BP and GA is investigated through the presented hybrid structures by applying them to five popular benchmark problems. In a second phase, the most efficient of these hybrid algorithms is applied to a typical pattern recognition task. It is concluded that a more sophisticated structure, based on the collaboration of two powerful optimisation algorithms, can train a typical neural network more efficiently.
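The abstract does not reproduce the authors' exact scheme, but the general idea of a two-stage GA-then-BP hybrid can be illustrated with a minimal sketch. In the example below (all hyperparameters, the network size, and the XOR toy benchmark are illustrative assumptions, not the paper's settings), a simple GA evolves the flat weight vector of a small MLP as a global search stage, and plain gradient-descent backpropagation then refines the fittest individual:

```python
# Minimal sketch of a two-stage GA -> BP hybrid (illustrative, not the
# authors' exact algorithm): stage 1 evolves MLP weights globally with a
# genetic algorithm, stage 2 refines the best individual with backprop.
import numpy as np

rng = np.random.default_rng(0)

# XOR as a toy benchmark
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
N_W = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weight count

def unpack(w):
    """Split a flat weight vector into the MLP's matrices and biases."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

def mse(w):
    _, out = forward(w)
    return float(np.mean((out - y) ** 2))

# ---- Stage 1: genetic algorithm (global search) ----
POP, GENS, MUT_STD = 40, 60, 0.3
pop = rng.normal(0.0, 1.0, size=(POP, N_W))
for _ in range(GENS):
    fitness = np.array([mse(ind) for ind in pop])
    order = np.argsort(fitness)             # lower error = fitter
    parents = pop[order[:POP // 2]]         # truncation selection
    children = []
    for _ in range(POP - len(parents)):
        pa, pb = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(N_W) < 0.5        # uniform crossover
        children.append(np.where(mask, pa, pb) + rng.normal(0, MUT_STD, N_W))
    pop = np.vstack([parents, children])

best = pop[np.argmin([mse(ind) for ind in pop])].copy()
print(f"after GA stage: MSE = {mse(best):.4f}")

# ---- Stage 2: backpropagation (local refinement) ----
LR, EPOCHS = 0.5, 2000
w = best
for _ in range(EPOCHS):
    W1, b1, W2, b2 = unpack(w)
    h, out = forward(w)
    d_out = (out - y) * out * (1 - out) * (2.0 / len(X))  # dMSE/dz2
    d_h = (d_out @ W2.T) * h * (1 - h)                    # backprop to hidden
    grad = np.concatenate([
        (X.T @ d_h).ravel(), d_h.sum(0),
        (h.T @ d_out).ravel(), d_out.sum(0),
    ])
    w = w - LR * grad

print(f"after BP stage: MSE = {mse(w):.4f}")
```

In this arrangement the GA supplies a good starting point in weight space, so the gradient stage is less likely to stall in the poor local minima that a random initialisation can produce; the five benchmark problems and the pattern recognition task in the paper probe exactly this division of labour between global and local search.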
