Performance analysis: Differential search algorithm based on randomization and benchmark functions

Purpose: The differential search algorithm (DSA) is a recent meta-heuristic optimization algorithm that simulates the Brownian-like, random-walk movement of an organism migrating toward a better position. The purpose of this paper is to analyze the performance of DSA along two key dimensions: six random number generators (RNGs) and benchmark functions (BMFs) from the IEEE World Congress on Evolutionary Computation (CEC, 2015). Because the study also took problem dimensionality and the maximum number of function evaluations (MFE) into account, various configurations were executed to assess each parameter's influence. The shifted rotated Rastrigin's function provided the best outcomes for the majority of RNGs, and the minimum dimensionality offered the best average. Across almost all BMFs studied, the Weibull and Beta RNGs yielded the best and worst averages, respectively. Overall, 50,000 MFEs provided the best results with almost all RNGs and BMFs.

Design/methodology/approach: DSA was tested under six randomizers (Bernoulli, Beta, Binomial, Chi-square, Rayleigh and Weibull), two unimodal functions (rotated high conditioned elliptic function and rotated cigar function), three simple multimodal functions (shifted rotated Ackley's, shifted rotated Rastrigin's and shifted rotated Schwefel's functions) and three hybrid functions (Hybrid Function 1 (n=3), Hybrid Function 2 (n=4) and Hybrid Function 3 (n=5)) at four problem dimensionalities (10D, 30D, 50D and 100D). Following the protocol of the CEC (2015) testbed, the stopping criterion is the MFE, which is set to 10,000, 50,000 and 100,000. All algorithms were implemented on a PC running Windows 8.1 with an Intel Core i5 CPU at 1.60 GHz (2.29 GHz) and a 64-bit operating system.

Findings: Based on the RNGs, the authors conclude the following: F3 gave the best average results with Bernoulli, whereas F4 produced the best outcomes with all other RNGs; the minimum and maximum dimensionalities offered the best and worst averages, respectively; and the Bernoulli and Binomial RNGs retained the best and worst averages, respectively, when all other parameters were fixed. Based on the BMFs, the authors conclude the following: the Weibull and Beta RNGs produced the best and worst averages with most BMFs, and the shifted rotated Rastrigin's function and Hybrid Function 2 gave rise to the best and worst averages, respectively. In both parts, 50,000 MFEs offered the best average results with most RNGs and BMFs.

Originality/value: Being aware of the advantages and drawbacks of DSA enlarges knowledge about the class of algorithms to which differential evolution belongs. Applying that knowledge to specific problems ensures that possible improvements are not applied at random. Strengths and weaknesses influenced by the characteristics of the problem being solved (e.g. linearity, dimensionality) and by the internal approaches being used (e.g. stopping criteria, parameter control settings, initialization procedure) have not been studied in detail. An in-depth study of performance under various conditions is a must if one wishes to apply DSA efficiently to specific problems. In this work, all functions were chosen from the 2015 IEEE World Congress on Evolutionary Computation (CEC, 2015).
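To make the experimental setup concrete, the following is a minimal Python sketch of how a Brownian-like DSA-style migration loop can be parameterized by the six randomizers and the MFE budget described above. It is an assumption-laden illustration, not the authors' implementation: the distribution parameters, the sphere objective and the `dsa_sketch`/`make_rng_sampler` helpers are hypothetical, and the CEC 2015 test functions themselves are not reproduced here.

```python
import numpy as np

# Hypothetical factory mapping each randomizer studied in the paper to a
# NumPy sampler; the shape/scale parameters chosen here are illustrative only.
def make_rng_sampler(name, rng):
    samplers = {
        "bernoulli": lambda size: rng.binomial(1, 0.5, size),
        "beta":      lambda size: rng.beta(2.0, 2.0, size),
        "binomial":  lambda size: rng.binomial(10, 0.5, size) / 10.0,
        "chisquare": lambda size: rng.chisquare(2.0, size),
        "rayleigh":  lambda size: rng.rayleigh(1.0, size),
        "weibull":   lambda size: rng.weibull(1.5, size),
    }
    return samplers[name]

def dsa_sketch(objective, dim, pop_size=30, max_fe=10_000,
               randomizer="weibull", bounds=(-100.0, 100.0), seed=0):
    """Simplified Brownian-like migration loop in the spirit of DSA."""
    rng = np.random.default_rng(seed)
    sample = make_rng_sampler(randomizer, rng)
    low, high = bounds
    pop = rng.uniform(low, high, (pop_size, dim))
    fitness = np.apply_along_axis(objective, 1, pop)
    evaluations = pop_size

    while evaluations < max_fe:
        donors = pop[rng.permutation(pop_size)]   # randomly permuted population
        scale = sample((pop_size, 1))             # randomizer-dependent step scale
        stopover = pop + scale * (donors - pop)   # Brownian-like migration move
        stopover = np.clip(stopover, low, high)
        new_fitness = np.apply_along_axis(objective, 1, stopover)
        evaluations += pop_size
        better = new_fitness < fitness            # greedy replacement
        pop[better] = stopover[better]
        fitness[better] = new_fitness[better]

    return pop[np.argmin(fitness)], fitness.min()

# Example usage on a sphere function (a stand-in for the CEC 2015 suite).
if __name__ == "__main__":
    best_x, best_f = dsa_sketch(lambda x: np.sum(x ** 2), dim=10,
                                max_fe=10_000, randomizer="weibull")
    print(f"best fitness: {best_f:.4e}")
```

Looping a sketch like this over the full grid of six randomizers, eight benchmark functions, four dimensionalities (10D, 30D, 50D, 100D) and three MFE budgets (10,000; 50,000; 100,000) would mirror the factorial design the paper describes, with each configuration averaged over repeated seeded runs.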

[1] Xiangyu Wang, et al. A novel differential search algorithm and applications for structure design, 2015, Appl. Math. Comput.

[2] Alper Ozpinar, et al. Use of Chaotic Randomness Numbers: Metaheuristic and Artificial Intelligence Algorithms, 2016.

[3] A. E. Eiben, et al. Parameter tuning for configuring and analyzing evolutionary algorithms, 2011, Swarm Evol. Comput.

[4] Andrew Lewis, et al. The Whale Optimization Algorithm, 2016, Adv. Eng. Softw.

[5] Anthony Brabazon, et al. The raven roosting optimisation algorithm, 2016, Soft Comput.

[6] Cihan Kaleli, et al. A review on deep learning for recommender systems: challenges and remedies, 2018, Artificial Intelligence Review.

[7] Andrew Lewis, et al. Grey Wolf Optimizer, 2014, Adv. Eng. Softw.

[8] Alain Hertz, et al. Guidelines for the use of meta-heuristics in combinatorial optimization, 2003, Eur. J. Oper. Res.

[9] Introduction to Stochastic Processes, 2014.

[10] Messias Borges Silva, et al. Improving the Performance of Metaheuristics: An Approach Combining Response Surface Methodology and Racing Algorithms, 2015.

[11] Arpan Kumar Kar, et al. Bio inspired computing - A review of algorithms and scope of applications, 2016, Expert Syst. Appl.

[12] Saeid Kazemzadeh Azad, et al. Adaptive dimensional search: A new metaheuristic algorithm for discrete truss sizing optimization, 2015.

[13] Jing J. Liang, et al. Problem Definitions and Evaluation Criteria for the CEC 2005 Special Session on Real-Parameter Optimization, 2005.

[14] István Erlich, et al. Testing MVMO on learning-based real-parameter single objective benchmark optimization problems, 2015, 2015 IEEE Congress on Evolutionary Computation (CEC).

[15] Kenneth Sörensen, et al. Metaheuristics - the metaphor exposed, 2015, Int. Trans. Oper. Res.

[16] Silvia Curteanu, et al. The use of differential evolution algorithm for solving chemical engineering problems, 2016.

[17] Pinar Civicioglu, et al. Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm, 2012, Comput. Geosci.

[18] Reza Moghdani, et al. Volleyball Premier League Algorithm, 2018, Appl. Soft Comput.

[19] Ali Kaveh, et al. Advances in Metaheuristic Algorithms for Optimal Design of Structures, 2014.

[20] Dipayan Guha, et al. Study of differential search algorithm based automatic generation control of an interconnected thermal-thermal system with governor dead-band, 2017, Appl. Soft Comput.

[21] Yuhui Shi, et al. Metaheuristic research: a comprehensive survey, 2018, Artificial Intelligence Review.

[22] Gaurav Dhiman, et al. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications, 2017, Adv. Eng. Softw.

[23] Ivan Zelinka, et al. Behaviour of pseudo-random and chaotic sources of stochasticity in nature-inspired optimization methods, 2014, Soft Computing.

[24] Kerim Guney, et al. Antenna Array Synthesis and Failure Correction Using Differential Search Algorithm, 2014.

[25] Miguel A. Vega-Rodríguez, et al. Sensitiveness of Evolutionary Algorithms to the Random Number Generator, 2011, ICANNGA.

[26] Y. Vasseghian, et al. Effect of various formulation ingredients on thermal characteristics of PVC/clay nanocomposite foams: experimental and modeling, 2017.

[27] Jason Sheng-Hong Tsai, et al. A self-optimization approach for L-SHADE incorporated with eigenvector-based crossover and successful-parent-selecting framework on CEC 2015 benchmark set, 2015, 2015 IEEE Congress on Evolutionary Computation (CEC).

[28] John Yen, et al. Introduction, 2004, CACM.

[29] Ying Tan, et al. Dynamic search fireworks algorithm with covariance mutation for solving the CEC 2015 learning based competition problems, 2015, 2015 IEEE Congress on Evolutionary Computation (CEC).

[30] Xin-She Yang, et al. A literature survey of benchmark functions for global optimisation problems, 2013, Int. J. Math. Model. Numer. Optimisation.

[31] Kok Lay Teo, et al. An exact penalty function-based differential search algorithm for constrained global optimization, 2016, Soft Comput.

[32] Daniel Campos, et al. Stochastic Foundations in Movement Ecology: Anomalous Diffusion, Front Propagation and Random Searches, 2013.

[33] Çağlayan Balkaya, et al. Parameter estimation by Differential Search Algorithm from horizontal loop electromagnetic (HLEM) data, 2018.

[34] L. Guo, et al. A self-adaptive dynamic particle swarm optimizer, 2015, 2015 IEEE Congress on Evolutionary Computation (CEC).

[35] Adem Alpaslan Altun, et al. The Binary Differential Search Algorithm Approach for Solving Uncapacitated Facility Location Problems, 2017.