Chaotic Time Series Prediction Based on Improved MOPSO-ELM

Extreme learning machines (ELMs) have attracted increasing attention because their input weights and hidden biases are set randomly, leaving only the output weights to be trained. However, an excessive number of hidden neurons can make training ill-posed when the Moore-Penrose generalized inverse is used to compute the output weights: the resulting weights can have very large magnitudes, which harms generalization. To address this problem, we propose an improved multi-objective particle swarm optimization (MOPSO) algorithm to optimize the output weights of ELMs. It uses two objective functions: the sum of the errors between predicted and target values, and the $L_{2}$ norm of the output weights. We improve the standard MOPSO algorithm by adding a chaotic mapping that promotes the ergodicity and randomness of new solutions. In addition, a new strategy for selecting $pbest$ and $gbest$ is proposed to handle the imbalance among multiple objectives. Experimental results on two benchmark chaotic time series confirm the efficiency and effectiveness of the proposed algorithm: it achieves better generalization and smaller output weights than the original ELM and other evolutionary ELMs.
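To make the motivation concrete, the following is a minimal sketch of the baseline ELM described above: random input weights and hidden biases are fixed, and the output weights are computed in one shot via the Moore-Penrose pseudoinverse. The tanh activation, the toy one-step-ahead sine task, and the layer sizes are illustrative assumptions, not details from the paper; the point is that a deliberately oversized hidden layer tends to yield large-magnitude output weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden):
    """Basic ELM: random input weights/biases, pseudoinverse output weights."""
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                           # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y                     # Moore-Penrose solution for output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy one-step-ahead prediction on a noisy sine series (illustrative data only)
t = np.linspace(0.0, 20.0, 400)
series = np.sin(t) + 0.01 * rng.standard_normal(400)
X = series[:-1].reshape(-1, 1)  # current value
y = series[1:]                  # next value

W, b, beta = elm_train(X, y, n_hidden=200)  # deliberately excessive hidden neurons
print(np.linalg.norm(beta))                 # the weight norm the second objective penalizes
```

The paper's multi-objective approach targets exactly this quantity: instead of accepting whatever output-weight norm the pseudoinverse produces, it trades training error against $\lVert\beta\rVert_2$.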

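The two objective functions and the chaotic mapping named in the abstract can be sketched as follows. The logistic map is a common choice of chaotic mapping in the PSO literature, but the paper does not specify which map it uses, so treat it as an assumption; the Pareto-dominance helper is the standard minimization definition used by MOPSO variants.

```python
import numpy as np

def logistic_map(x, mu=4.0):
    """Logistic map on (0, 1); mu = 4 is the fully chaotic regime (an assumed choice)."""
    return mu * x * (1.0 - x)

def chaotic_sequence(x0, n):
    """Generate n chaotic values, e.g. to diversify new candidate solutions."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = logistic_map(x)
        seq[i] = x
    return seq

def objectives(beta, H, y):
    """The two objectives from the abstract, both minimized:
    f1 = sum of squared prediction errors, f2 = L2 norm of the output weights."""
    f1 = np.sum((H @ beta - y) ** 2)
    f2 = np.linalg.norm(beta)
    return f1, f2

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in all objectives, better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
```

In a full MOPSO loop, `objectives` would score each particle's candidate output-weight vector, `dominates` would maintain the external archive of non-dominated solutions, and `chaotic_sequence` would replace some uniform random draws when generating new positions.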