A boosting approach based on bat optimization in MLP neural networks: Classification task

Finding optimal weights for artificial neural networks (ANNs) is a crucial problem. Several approaches have attempted to obtain efficient weights using metaheuristic algorithms. In this paper, we use the bat optimization algorithm and its modifications to find optimal weights for multilayer perceptron (MLP) neural networks, which serve as weak classifiers in the AdaBoost algorithm. The modifications of the bat algorithm improve its exploration and exploitation capabilities when minimizing the mean square error (MSE) of the MLP. The bat algorithm is inspired by the echolocation behavior of real bats and combines the benefits of population-based and local search algorithms. Experimental results show that the proposed hybrid approach improves the detection rate of MLP neural networks on classification tasks. Furthermore, a Wilcoxon test shows that this approach is more efficient than several comparable baseline methods for this problem.
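To illustrate the core idea, the following is a minimal sketch of the standard bat algorithm (Yang, 2010) used to minimize the MSE of a small MLP. The network size, toy data, and all parameter values (number of bats, frequency range, loudness decay `alpha`, pulse-rate growth `gamma`) are illustrative assumptions, not values from the paper, and the paper's modifications to the bat algorithm are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data (a hypothetical stand-in for the paper's datasets).
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# A tiny MLP: 2 inputs -> 4 hidden units (tanh) -> 1 sigmoid output.
N_IN, N_HID = 2, 4
DIM = N_IN * N_HID + N_HID + N_HID + 1  # total number of weights and biases

def mse(w):
    """Mean square error of the MLP whose weights are the flat vector w."""
    W1 = w[:N_IN * N_HID].reshape(N_IN, N_HID)
    b1 = w[N_IN * N_HID:N_IN * N_HID + N_HID]
    W2 = w[N_IN * N_HID + N_HID:-1]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((out - y) ** 2)

def bat_optimize(f, dim, n_bats=20, n_iter=200,
                 f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9):
    """Standard bat algorithm: frequency-tuned velocities, local random
    walks gated by pulse rate, and acceptance gated by loudness."""
    pos = rng.uniform(-1, 1, size=(n_bats, dim))
    vel = np.zeros((n_bats, dim))
    loud = np.ones(n_bats)            # loudness A_i
    pulse = np.full(n_bats, 0.5)      # pulse emission rate r_i
    fit = np.array([f(p) for p in pos])
    best = pos[fit.argmin()].copy()
    best_fit = fit.min()
    for t in range(n_iter):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()
            vel[i] += (pos[i] - best) * freq
            cand = pos[i] + vel[i]
            if rng.random() > pulse[i]:
                # Local random walk around the current best solution.
                cand = best + 0.01 * loud.mean() * rng.normal(size=dim)
            cand_fit = f(cand)
            if cand_fit < fit[i] and rng.random() < loud[i]:
                pos[i], fit[i] = cand, cand_fit
                loud[i] *= alpha                              # quieter
                pulse[i] = 0.5 * (1 - np.exp(-gamma * (t + 1)))  # louder pulses
            if fit[i] < best_fit:
                best, best_fit = pos[i].copy(), fit[i]
    return best, best_fit

w, err = bat_optimize(mse, DIM)
print(f"final MSE: {err:.4f}")
```

In the paper's hybrid approach, each MLP trained this way would then act as one weak classifier inside AdaBoost, with the boosting weights computed from its classification error in the usual way.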
