Evolving Artificial Neural Networks Using Butterfly Optimization Algorithm for Data Classification

One of the most difficult challenges in machine learning is training artificial neural networks, which amounts chiefly to determining the best set of weights and biases. Gradient descent techniques are the most popular training algorithms; however, they are susceptible to local optima and slow convergence. Therefore, several stochastic optimization algorithms have been proposed in the literature to alleviate the shortcomings of gradient descent approaches. The butterfly optimization algorithm (BOA) is a recently proposed meta-heuristic inspired by the food-foraging behavior of butterflies in nature. BOA has been shown to be effective on a wide range of optimization problems and capable of reaching globally optimal solutions. In this paper, a new classification method based on combining artificial neural networks with BOA is proposed. To this end, BOA is applied as a new training strategy that optimizes the weights and biases of the network, improving convergence speed and reducing the risk of becoming trapped in local optima. The proposed classification method is compared with other state-of-the-art methods on two well-known data sets using several evaluation measures. The experimental results confirm the superiority of the proposed method over the other methods.
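To make the training scheme concrete, the sketch below shows one way such a method can be realized: the weights and biases of a small one-hidden-layer network are flattened into a single vector, and a BOA-style population search (using the standard fragrance, global-phase, and local-phase updates of Arora and Singh's BOA) minimizes the classification error. This is a minimal illustration, not the authors' implementation; the network size, hyper-parameters (c, a, switch probability p), mean-squared-error fitness, and greedy replacement rule are all illustrative assumptions.

```python
# Minimal sketch: training an MLP classifier's weights and biases with a BOA-style search.
# All hyper-parameters and the fitness definition are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def unpack(theta, n_in, n_hid, n_out):
    """Split a flat parameter vector into MLP weight matrices and bias vectors."""
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = theta[i:i + n_out]
    return W1, b1, W2, b2

def forward(theta, X, n_in, n_hid, n_out):
    """One-hidden-layer MLP with tanh hidden units and a sigmoid output."""
    W1, b1, W2, b2 = unpack(theta, n_in, n_hid, n_out)
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(theta, X, y, n_in, n_hid, n_out):
    """Mean squared classification error; the search minimizes this value."""
    p = forward(theta, X, n_in, n_hid, n_out).ravel()
    return np.mean((p - y) ** 2)

def boa_train(X, y, n_hid=5, pop=30, iters=200, p_switch=0.8, c=0.01, a=0.1):
    n_in, n_out = X.shape[1], 1
    dim = n_in * n_hid + n_hid + n_hid * n_out + n_out
    pop_x = rng.uniform(-1, 1, (pop, dim))           # each row is a candidate weight/bias vector
    fit = np.array([fitness(x, X, y, n_in, n_hid, n_out) for x in pop_x])
    best = pop_x[fit.argmin()].copy()
    for _ in range(iters):
        # Fragrance f = c * I^a; intensity I is the inverted fitness since we minimize (assumption).
        frag = c * (1.0 / (1.0 + fit)) ** a
        for i in range(pop):
            r = rng.random()
            if rng.random() < p_switch:               # global phase: move toward the best butterfly
                step = (r ** 2) * best - pop_x[i]
            else:                                     # local phase: move using two random butterflies
                j, k = rng.integers(0, pop, 2)
                step = (r ** 2) * pop_x[j] - pop_x[k]
            cand = pop_x[i] + step * frag[i]
            f_cand = fitness(cand, X, y, n_in, n_hid, n_out)
            if f_cand < fit[i]:                       # greedy replacement (assumption)
                pop_x[i], fit[i] = cand, f_cand
        best = pop_x[fit.argmin()].copy()
    return best, n_in, n_hid, n_out

# Toy usage on XOR-like data to show the training loop end to end.
X = rng.random((200, 2))
y = ((X[:, 0] > 0.5) ^ (X[:, 1] > 0.5)).astype(float)
theta, n_in, n_hid, n_out = boa_train(X, y)
acc = np.mean((forward(theta, X, n_in, n_hid, n_out).ravel() > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

Because the fitness is evaluated on the whole parameter vector at once, no gradients are needed, which is what lets the population search escape the local optima that backpropagation can get stuck in; the trade-off is many more forward passes per update.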
