Hybrid Artificial Bee Colony algorithm for neural network training

A hybrid algorithm combining the Artificial Bee Colony (ABC) algorithm with the Levenberg-Marquardt (LM) algorithm is introduced to train artificial neural networks (ANNs). Training an ANN is an optimization task whose goal is to find the optimal set of connection weights for the network. Traditional training algorithms may get stuck in local minima, while global search techniques may approach the global minimum very slowly. Therefore, hybrid models that combine global search algorithms with conventional techniques are used to train neural networks. In this work, the ABC algorithm is hybridized with the LM algorithm and applied to neural network training.
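The sketch below illustrates one way such a hybridization can be organized, assuming the common pattern of running ABC as the global search over the weight vector and then refining the best food source with an LM step. The network size (1-5-1), the toy sine-regression task, the ABC parameters (colony size, abandonment limit, cycle count), and the use of SciPy's least_squares with method='lm' as the LM stage are illustrative assumptions, not details taken from the paper.

```python
# Simplified sketch of an ABC + Levenberg-Marquardt hybrid for ANN training.
# Assumptions (not from the paper): a 1-5-1 feedforward network with tanh hidden
# units, a toy sine-regression task, and SciPy's least_squares(method='lm') as LM.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Toy data: approximate y = sin(x) on [-pi, pi]
X = np.linspace(-np.pi, np.pi, 40).reshape(-1, 1)
y = np.sin(X).ravel()

N_HIDDEN = 5
DIM = N_HIDDEN * 3 + 1                      # w1 (1x5), b1 (5), w2 (5x1), b2 (1)

def unpack(w):
    w1 = w[:N_HIDDEN].reshape(1, N_HIDDEN)
    b1 = w[N_HIDDEN:2 * N_HIDDEN]
    w2 = w[2 * N_HIDDEN:3 * N_HIDDEN].reshape(N_HIDDEN, 1)
    b2 = w[-1]
    return w1, b1, w2, b2

def forward(w, X):
    w1, b1, w2, b2 = unpack(w)
    h = np.tanh(X @ w1 + b1)
    return (h @ w2).ravel() + b2

def residuals(w):                           # per-sample errors, as LM expects
    return forward(w, X) - y

def mse(w):
    return np.mean(residuals(w) ** 2)

# ---- Phase 1: ABC global search over the weight vector ---------------------
SN, LIMIT, CYCLES, BOUND = 20, 30, 200, 2.0
foods = rng.uniform(-BOUND, BOUND, (SN, DIM))   # one food source = one weight set
fits = np.array([mse(f) for f in foods])
trials = np.zeros(SN, dtype=int)

def neighbour(i):
    # Perturb one dimension of food source i towards/away from a random partner k
    k = rng.integers(SN)
    while k == i:
        k = rng.integers(SN)
    j = rng.integers(DIM)
    cand = foods[i].copy()
    cand[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
    return cand

for _ in range(CYCLES):
    # Employed-bee phase: each food source gets one local perturbation
    for i in range(SN):
        cand = neighbour(i)
        f = mse(cand)
        if f < fits[i]:
            foods[i], fits[i], trials[i] = cand, f, 0
        else:
            trials[i] += 1
    # Onlooker phase: better sources attract proportionally more perturbations
    probs = 1.0 / (1.0 + fits)
    probs /= probs.sum()
    for i in rng.choice(SN, size=SN, p=probs):
        cand = neighbour(i)
        f = mse(cand)
        if f < fits[i]:
            foods[i], fits[i], trials[i] = cand, f, 0
        else:
            trials[i] += 1
    # Scout phase: abandon sources that have not improved for LIMIT trials
    for i in np.where(trials > LIMIT)[0]:
        foods[i] = rng.uniform(-BOUND, BOUND, DIM)
        fits[i] = mse(foods[i])
        trials[i] = 0

best = foods[np.argmin(fits)]
print("MSE after ABC:", mse(best))

# ---- Phase 2: Levenberg-Marquardt refinement of the best ABC solution ------
refined = least_squares(residuals, best, method='lm')
print("MSE after LM :", mse(refined.x))
```

The intent of the split mirrors the abstract: the derivative-free ABC phase explores the weight space broadly and is less prone to stalling in poor local minima, while the LM phase exploits local curvature around the best candidate, where gradient-based methods converge quickly.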
