Simplifying Particle Swarm Optimization

The general-purpose optimization method known as Particle Swarm Optimization (PSO) has received much attention in recent years, with many attempts to find the variant that performs best on a wide variety of optimization problems. The focus of past research has been on making the PSO method more complex, as this is frequently believed to increase its adaptability to other optimization problems. This study takes the opposite approach and simplifies the PSO method. To compare the efficacy of the original PSO method and its simplified variant, a simple technique is presented for efficiently tuning their behavioural parameters. The technique works by employing an overlaid meta-optimizer, which is capable of simultaneously tuning parameters with regard to multiple optimization problems, whereas previous approaches to meta-optimization have tuned behavioural parameters to work well on just a single optimization problem. It is found that not only do the PSO method and its simplified variant have comparable performance when optimizing a number of Artificial Neural Network problems, but the simplified variant also appears to offer a small improvement in some cases.
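To make the comparison concrete, the following is a minimal sketch of global-best PSO with an optional simplified velocity update. It is illustrative only: the function name pso, the default parameter values (omega, phi_p, phi_g), and the choice of dropping the cognitive (personal-best) term when simplified=True are assumptions made for this sketch, since the abstract does not spell out the exact form of the simplified variant.

import numpy as np

def pso(f, bounds, n_particles=30, n_iters=1000,
        omega=0.72, phi_p=1.49, phi_g=1.49, simplified=False):
    """Minimise f over a box given as (lower, upper) arrays.
    If simplified=True, the cognitive (personal-best) term is dropped,
    so particles are attracted only to the swarm's best-known position."""
    lo, hi = bounds
    dim = lo.shape[0]
    rng = np.random.default_rng()

    x = rng.uniform(lo, hi, size=(n_particles, dim))                 # positions
    v = rng.uniform(-(hi - lo), hi - lo, size=(n_particles, dim))    # velocities
    p = x.copy()                                                     # personal bests
    p_val = np.apply_along_axis(f, 1, x)
    g = p[np.argmin(p_val)].copy()                                   # swarm best
    g_val = p_val.min()

    for _ in range(n_iters):
        r_p = rng.random((n_particles, dim))
        r_g = rng.random((n_particles, dim))
        if simplified:
            v = omega * v + phi_g * r_g * (g - x)                    # social term only
        else:
            v = omega * v + phi_p * r_p * (p - x) + phi_g * r_g * (g - x)
        x = np.clip(x + v, lo, hi)

        vals = np.apply_along_axis(f, 1, x)
        improved = vals < p_val
        p[improved], p_val[improved] = x[improved], vals[improved]
        if p_val.min() < g_val:
            g_val = p_val.min()
            g = p[np.argmin(p_val)].copy()

    return g, g_val

The meta-optimization idea can then be sketched as an outer search over the behavioural parameters, where each candidate parameter set is rated by the combined result of running the optimizer on several problems at once. Plain random search stands in for the overlaid meta-optimizer here, and the two benchmark problems and parameter ranges below are arbitrary placeholders rather than the ones used in the study.

problems = [
    (lambda z: float(np.sum(z * z)),     (np.full(5, -5.0),  np.full(5, 5.0))),   # sphere
    (lambda z: float(np.sum(np.abs(z))), (np.full(5, -10.0), np.full(5, 10.0))),  # absolute sum
]

def meta_fitness(params):
    omega, phi_p, phi_g = params
    # Sum of best fitnesses found across all problems (lower is better).
    return sum(pso(f, b, n_iters=200, omega=omega,
                   phi_p=phi_p, phi_g=phi_g)[1]
               for f, b in problems)

rng = np.random.default_rng(0)
candidates = rng.uniform([-1.0, 0.0, 0.0], [1.0, 3.0, 3.0], size=(50, 3))
best_params = min(candidates, key=meta_fitness)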
