A Self-Organizing Mixture Extreme Learning Machine for Time Series Forecasting

A novel Self-Organizing Mixture Extreme Learning Machine (SOM-ELM) model and algorithm for time series forecasting is proposed in this paper. Stock time series are non-stationary stochastic processes that switch their dynamics from time to time, or follow different models in different periods. The ELM algorithm also has drawbacks: when the numbers of samples and hidden nodes are very large, computing the Moore-Penrose generalized inverse of the hidden-layer output matrix H becomes costly, the numerical error in its elements grows, and the generalization performance of the network deteriorates. Together, these observations imply that a single parametric model cannot convincingly or practically capture the dynamics of an entire time series. A SOM competitive layer is therefore placed in front of the ELM network to form the SOM-ELM model, in which each cluster of samples produced by the SOM is handled by its own ELM. The improved generalization performance of the SOM-ELM algorithm is verified through experiments on real stock time series.
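To make the two-stage idea concrete, the following is a minimal sketch, not the authors' implementation: it uses a simple competitive layer in place of a full SOM neighbourhood update, a sigmoid hidden layer, and output weights obtained analytically from the Moore-Penrose pseudoinverse. All function names, the lag length, and the parameter values are illustrative assumptions.

```python
import numpy as np

def train_som(X, n_units=4, epochs=50, lr=0.5, seed=0):
    """Competitive layer: learn prototype vectors that partition the samples."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].copy()
    for t in range(epochs):
        eta = lr * (1 - t / epochs)                         # decaying learning rate
        for x in X[rng.permutation(len(X))]:
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))  # best-matching unit
            W[bmu] += eta * (x - W[bmu])                    # move the winner toward the sample
    return W

def assign(X, W):
    """Index of the winning prototype for each sample."""
    return np.argmin(np.linalg.norm(X[:, None, :] - W[None], axis=2), axis=1)

def train_elm(X, y, n_hidden=30, seed=0):
    """ELM: random input weights, analytic output weights via the pseudoinverse."""
    rng = np.random.default_rng(seed)
    Win = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ Win + b)))                # hidden-layer output matrix H
    beta = np.linalg.pinv(H) @ y                            # Moore-Penrose least-squares solution
    return Win, b, beta

def predict_elm(X, Win, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ Win + b)))
    return H @ beta

def fit_som_elm(X, y, n_units=4, n_hidden=30):
    """SOM-ELM: cluster the lagged samples, then fit one small ELM per cluster."""
    W = train_som(X, n_units)
    labels = assign(X, W)
    elms = {k: train_elm(X[labels == k], y[labels == k], n_hidden)
            for k in np.unique(labels)}
    return W, elms

def predict_som_elm(X, W, elms):
    # Assumes every prototype won at least one training sample, so each
    # test sample is routed to a cluster that has a trained ELM.
    labels = assign(X, W)
    out = np.empty(len(X))
    for k, model in elms.items():
        m = labels == k
        out[m] = predict_elm(X[m], *model)
    return out

# Usage: lag-embed a univariate series, fit on the first part, forecast the rest.
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.default_rng(1).standard_normal(500)
lags = 5
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
W, elms = fit_som_elm(X[:400], y[:400])
pred = predict_som_elm(X[400:], W, elms)
```

In this sketch each ELM is trained on one cluster only, so every pseudoinverse is computed on a modest matrix rather than on a single large H for the whole series, which is the practical difficulty the SOM front-end is meant to avoid.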
