Boosting multi-step autoregressive forecasts

Multi-step forecasts can be produced recursively, by iterating a one-step model, or directly, by using a separate model for each horizon. Choosing between these two strategies is not straightforward, since it involves a trade-off between bias and estimation variance over the forecast horizon. Using a nonlinear machine learning model makes the trade-off even more difficult. To address this issue, we propose a new forecasting strategy that boosts traditional recursive linear forecasts with a direct strategy, applying a boosting autoregression procedure at each horizon. First, we investigate the performance of the proposed strategy in terms of the bias and variance decomposition of the error, using simulated time series. Then, we evaluate it on real-world time series from two forecasting competitions. Overall, we obtain excellent performance compared with the standard forecasting strategies.
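To make the strategy concrete, here is a minimal sketch of the idea described above: a recursive linear autoregression provides the base multi-step forecasts, and a horizon-specific boosting model is then fitted to the residuals of the recursive forecast at each horizon (the direct correction). All names (`embed`, `fit_boosted_multistep`, the lag order, the horizon `H`) are illustrative assumptions, and scikit-learn's `GradientBoostingRegressor` stands in for the boosting autoregression procedure used in the paper; this is not the authors' exact implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import GradientBoostingRegressor

def embed(y, lags, horizon):
    """Build a lagged design matrix X and the horizon-step-ahead target vector."""
    X, t = [], []
    for i in range(lags, len(y) - horizon + 1):
        X.append(y[i - lags:i][::-1])   # most recent lag first
        t.append(y[i + horizon - 1])    # value `horizon` steps ahead
    return np.array(X), np.array(t)

def recursive_forecast(model, history, lags, H):
    """Iterate a one-step linear AR model to produce an H-step forecast path."""
    window = list(history[-lags:])      # chronological order
    path = []
    for _ in range(H):
        x = np.array(window[::-1]).reshape(1, -1)
        yhat = model.predict(x)[0]
        path.append(yhat)
        window = window[1:] + [yhat]    # feed the forecast back in
    return np.array(path)

def fit_boosted_multistep(y, lags=5, H=8):
    # Step 1: recursive base forecaster (a one-step linear AR fitted once).
    X1, t1 = embed(y, lags, horizon=1)
    base = LinearRegression().fit(X1, t1)

    # Step 2: for each horizon h, boost the residuals of the recursive
    # forecast with a direct (horizon-specific) boosting model.
    correctors = []
    for h in range(1, H + 1):
        Xh, th = embed(y, lags, horizon=h)
        base_h = np.array([recursive_forecast(base, Xh[i][::-1], lags, h)[-1]
                           for i in range(len(Xh))])
        resid = th - base_h
        gb = GradientBoostingRegressor(n_estimators=100, max_depth=2,
                                       learning_rate=0.1).fit(Xh, resid)
        correctors.append(gb)
    return base, correctors

def forecast(base, correctors, y, lags=5):
    H = len(correctors)
    path = recursive_forecast(base, y, lags, H)
    x_last = np.array(y[-lags:][::-1]).reshape(1, -1)
    # Add the horizon-specific boosted correction to each recursive forecast.
    return np.array([path[h] + correctors[h].predict(x_last)[0]
                     for h in range(H)])
```

Under these assumptions, the recursive component keeps the low estimation variance of a single one-step linear model, while the per-horizon boosted corrections reduce the bias that accumulates when that model is iterated over the forecast horizon.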
