The gradient evolution algorithm: A new metaheuristic

This study presents a new metaheuristic derived from gradient-based search. In exact optimization methods, the gradient is used to locate extreme points, including the optimum. This study modifies a gradient method to create a metaheuristic that uses a gradient theorem as its basic updating rule. The new method, named gradient evolution, explores the search space using a set of vectors and includes three major operators: vector updating, jumping, and refreshing. Vector updating is the main updating rule in gradient evolution; the search direction is determined using the Newton-Raphson method. Vector jumping and refreshing enable the method to escape local optima. To evaluate the performance of gradient evolution, three experiments are conducted on fifteen test functions. The first experiment examines the influence of parameter settings on the results and identifies the best setting. The remaining experiments compare gradient evolution against basic and improved metaheuristic methods. The results show that gradient evolution performs better than, or as well as, other methods, such as particle swarm optimization, differential evolution, the artificial bee colony algorithm, and a continuous genetic algorithm, on most of the benchmark problems tested.
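The abstract states that the search direction of the vector-updating operator is determined by the Newton-Raphson method but does not give the exact update formula. As a minimal sketch, assuming a coordinate-wise Newton-Raphson step with central finite differences standing in for the analytic first and second derivatives (the function names and step size `h` here are illustrative, not from the paper), the core idea looks like:

```python
import numpy as np

def newton_raphson_step(f, x, h=1e-3):
    """One Newton-Raphson-style update per coordinate.

    First and second derivatives are approximated with central
    finite differences, so no analytic gradient is required.
    """
    x_new = x.copy()
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        grad = (f(x + e) - f(x - e)) / (2 * h)           # approx. f'(x_i)
        curv = (f(x + e) - 2 * f(x) + f(x - e)) / h**2   # approx. f''(x_i)
        if abs(curv) > 1e-12:                            # guard flat curvature
            x_new[i] = x[i] - grad / curv                # Newton-Raphson step
    return x_new

# Example: minimize the sphere function f(x) = sum(x_i^2).
sphere = lambda x: float(np.sum(x**2))
x = np.array([3.0, -2.0])
for _ in range(5):
    x = newton_raphson_step(sphere, x)
print(x)
```

In gradient evolution this kind of step drives a population of vectors rather than a single point, and the jumping and refreshing operators perturb or reinitialize vectors to avoid the local-optimum traps that a pure Newton-Raphson descent would fall into.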
