A novel particle swarm optimization algorithm with stochastic focusing search for real-parameter optimization

Basic particle swarm optimization (PSO) algorithms are susceptible to becoming trapped in local optima and to premature convergence. A novel stochastic focusing search (SFS) based PSO algorithm with an adaptively dynamic neighborhood topology and a subpopulation strategy is proposed. SFS simulates randomized human search behavior by means of the adaptively dynamic neighborhood topology. With the subpopulation strategy, SFS improves global search ability, maintains population diversity, and escapes local extrema. The algorithm's performance is studied on a challenging set of typically complex functions, in comparison with differential evolution (DE) and three modified PSO algorithms. The simulation results show that SFS is competitive on most of the benchmark problems and is a promising candidate search algorithm, especially where existing algorithms have difficulty.
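The abstract does not give SFS's update equations, so the following is only an illustrative sketch of the ingredients it names: a local-best PSO in which each particle's neighborhood is resampled every iteration, a crude stand-in for an adaptively dynamic topology. All function names, parameter values, and the sphere benchmark are assumptions for illustration, not the paper's method.

```python
import random

def sphere(x):
    # Benchmark objective: sum of squares, global minimum 0 at the origin.
    return sum(v * v for v in x)

def pso_dynamic_neighborhood(f, dim=5, n_particles=20, iters=200,
                             k=4, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Local-best PSO with a randomly resampled neighborhood per iteration
    (illustrative stand-in for an adaptively dynamic topology)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [f(p) for p in pos]             # personal best fitnesses
    for _ in range(iters):
        for i in range(n_particles):
            # Resample this particle's neighborhood of k informers.
            nbrs = rng.sample(range(n_particles), k)
            lbest = min(nbrs, key=lambda j: pbest_val[j])
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard inertia-weight velocity update toward the
                # personal best and the (dynamic) local best.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (pbest[lbest][d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest_val[i] = val
                pbest[i] = pos[i][:]
    best = min(range(n_particles), key=lambda j: pbest_val[j])
    return pbest[best], pbest_val[best]

best_x, best_val = pso_dynamic_neighborhood(sphere)
print(best_val)
```

Because each particle draws its informers afresh every iteration, information spreads through changing neighborhoods rather than a fixed ring or star, which is one common way to delay premature convergence in PSO variants.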
