A Performance Comparison of Crossover Variations in Differential Evolution for Training Multi-layer Perceptron Neural Networks

Artificial neural networks (ANNs) are a well-known class of machine learning techniques. Learning a given task requires adjusting the weights of their neurons, which is usually done with a gradient-based optimization algorithm. However, gradient-based optimization algorithms are prone to getting stuck in local optima, so researchers have attempted to apply population-based metaheuristics instead. In this paper, we compare the performance of various crossover operators in differential evolution (DE) for training ANNs. We investigated the classification performance of three crossover operators, namely the binomial crossover, the exponential crossover, and multiple exponential recombination (MER), on medical datasets. The experimental results show that the binomial crossover and MER perform better than the exponential crossover, and that the performance of the exponential crossover varies significantly depending on the network architecture. We also found that dependent variables in ANN training may not be located close to each other, which renders the advantage of the exponential crossover and MER ineffective.
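
To make the compared operators concrete, below is a minimal sketch (not the authors' implementation) of the standard binomial and exponential crossover operators used in DE. The function names, the use of NumPy, and the parameter names (target, mutant, cr) are illustrative assumptions; MER, which extends the exponential crossover to multiple copied segments, is omitted for brevity.

import numpy as np

def binomial_crossover(target, mutant, cr, rng):
    # Each gene comes from the mutant with probability cr; one randomly
    # chosen gene (j_rand) is always taken from the mutant so that the
    # trial vector differs from the target in at least one position.
    d = len(target)
    j_rand = rng.integers(d)
    mask = rng.random(d) < cr
    mask[j_rand] = True
    return np.where(mask, mutant, target)

def exponential_crossover(target, mutant, cr, rng):
    # Copies a contiguous (circular) block of genes from the mutant,
    # starting at a random index and extending while uniform draws stay
    # below cr, so it favors variables that are located near each other.
    d = len(target)
    trial = target.copy()
    j = rng.integers(d)
    copied = 0
    while True:
        trial[j] = mutant[j]
        j = (j + 1) % d
        copied += 1
        if copied >= d or rng.random() >= cr:
            break
    return trial

# Illustrative usage with a 10-dimensional weight vector.
rng = np.random.default_rng(0)
target = rng.standard_normal(10)
mutant = rng.standard_normal(10)
print(binomial_crossover(target, mutant, cr=0.9, rng=rng))
print(exponential_crossover(target, mutant, cr=0.9, rng=rng))

In the binomial case, copied genes are scattered independently across positions, whereas the exponential case copies a consecutive run; the latter only pays off when interdependent variables sit close together in the encoding, which is the property the abstract argues is lacking for ANN weight vectors.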
