Stepping away from maximizers of concave quadratics in random line search

Random Lines (RL) search relies on finding a minimizer of a given cost function along randomly selected lines in the function's domain. Once three points along each line are identified, a quadratic passing through these points is determined, and its minimizer is used whenever the quadratic is convex. This paper proposes a two-step approach for handling the concave case: (1) start from the point with the smallest function value among the three, and then (2) step in the direction away from the maximizer of the quadratic. Promising numerical results comparing the improved RL method with similar evolutionary methods are presented.

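To make the quadratic-model step concrete, the following minimal Python sketch fits a quadratic along one random line and applies the convex/concave rule described above. The function name `quadratic_line_step`, the fixed `step` size, and the handling of the degenerate linear case are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def quadratic_line_step(f, x0, d, t1, t2, t3, step=1.0):
    """One illustrative RL-style step along the line x0 + t*d.

    Fits q(t) = a*t^2 + b*t + c through three samples (t1, t2, t3).
    If the quadratic is convex (a > 0), its minimizer is used; if it is
    concave (a < 0), the step starts from the sample with the smallest
    cost and moves away from the quadratic's maximizer, as described in
    the abstract.  Parameter names and the fixed step size are
    illustrative assumptions.
    """
    ts = np.array([t1, t2, t3], dtype=float)
    fs = np.array([f(x0 + t * d) for t in ts])

    # Fit q(t) = a*t^2 + b*t + c through the three (t, f) pairs.
    a, b, c = np.polyfit(ts, fs, 2)

    if a > 0:                       # convex quadratic: use its minimizer
        t_new = -b / (2 * a)
    elif a < 0:                     # concave quadratic: step away from its maximizer
        t_max = -b / (2 * a)        # maximizer of the concave quadratic
        t_best = ts[np.argmin(fs)]  # best of the three samples
        t_new = t_best + step * np.sign(t_best - t_max)
    else:                           # degenerate (linear) case: move downhill
        t_best = ts[np.argmin(fs)]
        t_new = t_best - step * np.sign(b)

    return x0 + t_new * d


# Illustrative use: one RL-style step on a simple quadratic bowl.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda x: float(x @ x)          # cost function to minimize
    x0 = np.array([2.0, -1.0])          # current iterate
    d = rng.standard_normal(2)          # random line direction
    x1 = quadratic_line_step(f, x0, d, -1.0, 0.0, 1.0)
    print(f(x0), "->", f(x1))
```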