Improved particle swarm optimizer based on adaptive random learning approach

In the later stages of optimization, the particle swarm optimization (PSO) algorithm often loses population diversity, which easily causes it to become trapped in a local optimum. An improved PSO (IPSO) algorithm is therefore presented, in which each particle keeps its inertial motion while learning from another, randomly selected particle with better performance; moreover, the better a particle performs, the larger its inertia weight and the smaller its learning coefficient. Thus, when the particles are sorted in order of decreasing performance, their inertia weights decrease and their learning coefficients increase gradually. The new learning approach enhances the diversity of the population, while the new parameter-setting approach enhances its adaptability. Comparisons with the basic PSO on several well-known benchmark functions show that the IPSO algorithm has higher search speed and stronger global search ability.
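As a rough illustration of the rank-dependent parameter setting and random-learning rule described above, the following Python sketch implements one IPSO-style update step. The exact update formulas, parameter ranges (w_range, c_range), and ranking details are assumptions for illustration only, since the abstract does not specify them.

```python
import numpy as np

def ipso_step(positions, velocities, fitness, w_range=(0.4, 0.9),
              c_range=(0.5, 2.0), rng=None):
    """One illustrative IPSO-style update (a sketch, not the paper's exact formulas).

    Assumptions not stated in the abstract:
      - fitness is minimized, so "better" means lower fitness,
      - the inertia weight w and learning coefficient c vary linearly
        with the particle's performance rank,
      - each particle learns from one randomly chosen better particle
        via a PSO-style attraction term.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = positions.shape

    # Rank particles: rank 0 = best (lowest fitness), rank n-1 = worst.
    order = np.argsort(fitness)
    rank = np.empty(n, dtype=int)
    rank[order] = np.arange(n)

    # Better performance -> larger inertia weight, smaller learning coefficient.
    t = rank / max(n - 1, 1)                          # 0 for best, 1 for worst
    w = w_range[1] - t * (w_range[1] - w_range[0])    # decreases with rank
    c = c_range[0] + t * (c_range[1] - c_range[0])    # increases with rank

    new_vel = velocities.copy()
    for i in range(n):
        better = np.where(fitness < fitness[i])[0]
        if better.size == 0:
            # The best particle keeps pure inertia motion.
            new_vel[i] = w[i] * velocities[i]
            continue
        j = rng.choice(better)                        # random better exemplar
        r = rng.random(d)
        new_vel[i] = w[i] * velocities[i] + c[i] * r * (positions[j] - positions[i])

    return positions + new_vel, new_vel
```

In this sketch, the rank-based schedule realizes the adaptive rule of the abstract: well-performing particles rely more on their own inertia, while poorly performing particles are pulled more strongly toward a randomly chosen better particle, which keeps the population diverse.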
