A New Bootstrapped Hybrid Artificial Neural Network Approach for Time Series Forecasting

In this study, a new bootstrapped hybrid artificial neural network is proposed for time series forecasting. The network provides input-significance, linearity and nonlinearity hypothesis tests within a single network structure via a residual bootstrap approach. It consists of three parts: a linear part, a nonlinear part and a combination part, each with associated weights and biases. These weights are used to test the input-significance, linearity and nonlinearity hypotheses, with the new method providing empirical distributions for both the forecasts and the weights. Point forecasts are obtained with a bagging approach. The method is applied to real-world time series, including the M4 Competition data set and stock exchange series, and its performance is compared with appropriate benchmark methods, including other popular neural networks. The results of the proposed method are less affected by the initial random weights than those of other neural networks, and are therefore more stable and precise. The new method also improves forecasting accuracy over the established benchmarks.
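A minimal sketch may help make the described structure concrete. The snippet below, written in Python with NumPy and SciPy (an assumption; the paper does not prescribe an implementation), builds a toy hybrid model with a linear part, a sigmoid hidden-layer nonlinear part and two combination weights, fits it by minimising in-sample mean squared error, and then uses a residual bootstrap to produce bagged forecasts and empirical distributions of the forecasts and combination weights. The lag order, hidden-layer size, number of bootstrap replicates and the simulated series are illustrative choices, not the authors' settings.

```python
# Illustrative sketch only: a hybrid linear/nonlinear network with a residual
# bootstrap, not the authors' exact formulation. Lag order p, hidden size h,
# replicate count B and the toy series are assumptions made for the example.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def make_lags(y, p):
    """Lagged design matrix X[t] = [y[t-1], ..., y[t-p]] with target y[t]."""
    n = len(y)
    X = np.column_stack([y[p - 1 - k: n - 1 - k] for k in range(p)])
    return X, y[p:]

def unpack(theta, p, h):
    """Split the flat parameter vector into linear, nonlinear and combination parts."""
    i = 0
    w_lin, i = theta[i:i + p], i + p                          # linear-part weights
    b_lin, i = theta[i], i + 1                                # linear-part bias
    W_hid, i = theta[i:i + p * h].reshape(p, h), i + p * h    # hidden-layer weights
    b_hid, i = theta[i:i + h], i + h                          # hidden-layer biases
    w_out, i = theta[i:i + h], i + h                          # nonlinear output weights
    b_out, i = theta[i], i + 1                                # nonlinear output bias
    c = theta[i:i + 2]                                        # combination weights
    return w_lin, b_lin, W_hid, b_hid, w_out, b_out, c

def forward(theta, X, p, h):
    """Weighted combination of the linear and nonlinear parts."""
    w_lin, b_lin, W_hid, b_hid, w_out, b_out, c = unpack(theta, p, h)
    lin = X @ w_lin + b_lin
    hid = 1.0 / (1.0 + np.exp(-(X @ W_hid + b_hid)))          # sigmoid hidden layer
    nonlin = hid @ w_out + b_out
    return c[0] * lin + c[1] * nonlin

def fit(X, y, p, h):
    """Fit all weights by minimising in-sample MSE from a small random start."""
    n_par = p + 1 + p * h + 2 * h + 1 + 2
    theta0 = rng.normal(scale=0.1, size=n_par)
    obj = lambda th: np.mean((forward(th, X, p, h) - y) ** 2)
    return minimize(obj, theta0, method="L-BFGS-B").x

# Toy series: AR(1) with a mild nonlinear term.
n, p, h, B = 200, 3, 4, 50
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.2 * np.sin(y[t - 1]) + rng.normal(scale=0.3)

X, target = make_lags(y, p)
theta_hat = fit(X, target, p, h)
fitted = forward(theta_hat, X, p, h)
resid = target - fitted

# Residual bootstrap: resample residuals, rebuild pseudo-targets, refit, and
# collect one-step-ahead forecasts plus the combination weights of each fit.
x_new = y[-p:][::-1].reshape(1, -1)       # most recent p observations as next input
forecasts, comb_weights = [], []
for _ in range(B):
    y_star = fitted + rng.choice(resid, size=len(resid), replace=True)
    theta_b = fit(X, y_star, p, h)
    forecasts.append(forward(theta_b, x_new, p, h)[0])
    comb_weights.append(unpack(theta_b, p, h)[-1])

forecasts = np.array(forecasts)
comb_weights = np.array(comb_weights)
print("bagged point forecast:", forecasts.mean())                      # bagging over replicates
print("95% forecast interval:", np.percentile(forecasts, [2.5, 97.5])) # empirical forecast distribution
print("combination-weight bootstrap means:", comb_weights.mean(axis=0))
```

In the paper, the bootstrap distributions of the part weights are what feed the input-significance and linearity/nonlinearity hypothesis tests; the sketch only collects and summarises those distributions rather than carrying out the tests.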
