Exploring the sources of uncertainty: Why does bagging for time series forecasting work?

Abstract

In a recent study, Bergmeir, Hyndman and Benítez (2016) successfully employed a bootstrap aggregation (bagging) technique to improve the performance of exponential smoothing. Each series is Box-Cox transformed and decomposed by Seasonal and Trend decomposition using Loess (STL); bootstrapping is then applied to the remainder series before the trend and seasonality are added back and the transformation is reversed, creating bootstrapped versions of the series. Automatic exponential smoothing is subsequently applied to the original series and to each bootstrapped version, with the final forecast being the equal-weight combination of all the individual forecasts. In this study we address the question: why does bagging work for time series forecasting? We assume three sources of uncertainty (model uncertainty, data uncertainty, and parameter uncertainty) and separately explore the benefits of bagging for each of them. Our analysis considers 4004 time series (from the M- and M3-competitions) and two families of models. The results show that the benefits of bagging predominantly originate from model uncertainty: the fact that different models may be selected as optimal for different bootstrapped series. As such, a suitably weighted combination of the most appropriate models should be preferred to selecting a single model.

[1] John E. Boylan, et al. Judging the judges through accuracy-implication metrics: The case of inventory forecasting, 2010.

[2] George E. P. Box, et al. Empirical Model-Building and Response Surfaces, 1988.

[3] Rob J. Hyndman, et al. Another look at measures of forecast accuracy, 2006.

[4] M. Neves. Forecasting time series with the Boot.EXPOS procedure, 2009.

[5] Amir Ahmadi-Javid, et al. Outpatient appointment systems in healthcare: A review of optimization studies, 2017, Eur. J. Oper. Res.

[6] S. Kolassa. Combining exponential smoothing forecasts using Akaike weights, 2011.

[7] Leo Breiman. Bagging Predictors, 1996, Machine Learning.

[8] Fotios Petropoulos, et al. 'Horses for Courses' in demand forecasting, 2014, Eur. J. Oper. Res.

[9] H. Künsch. The Jackknife and the Bootstrap for General Stationary Observations, 1989.

[10] R Core Team. R: A language and environment for statistical computing, 2014.

[11] Víctor M. Guerrero. Time-series analysis supported by power transformations, 1993.

[12] Spyros Makridakis, et al. The M3-Competition: results, conclusions and implications, 2000.

[13] Fotios Petropoulos, et al. forecast: Forecasting functions for time series and linear models, 2018.

[14] Robert Tibshirani, et al. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, 2nd Edition, 2001, Springer Series in Statistics.

[15] D. Politis, et al. Banded and tapered estimates for autocovariance matrices and the linear process bootstrap, 2010.

[16] Rob J. Hyndman, et al. Automatic Time Series Forecasting: The forecast Package for R, 2008.

[17] Robert L. Winkler, et al. The accuracy of extrapolation (time series) methods: Results of a forecasting competition, 1982.

[18] Fotios Petropoulos, et al. An evaluation of simple versus complex selection rules for forecasting many time series, 2014.

[19] Aldis Jakubovskis, et al. Strategic facility location, capacity acquisition, and technology choice decisions under demand uncertainty: Robust vs. non-robust optimization approaches, 2017, Eur. J. Oper. Res.

[20] P. Bühlmann. Sieve bootstrap for time series, 1997.

[21] Robert Fildes, et al. A retail store SKU promotions optimization model for category multi-period profit maximization, 2017, Eur. J. Oper. Res.

[22] William M. Shyu, et al. Local Regression Models, 2017.

[23] George Athanasopoulos, et al. Forecasting: principles and practice, 2013.

[24] Leonard J. Tashman. Out-of-sample tests of forecasting accuracy: an analysis and review, 2000.

[25] Rob J. Hyndman, et al. Bagging exponential smoothing methods using STL decomposition and Box-Cox transformation, 2016.

[26] M. Fitzgerald, et al. Horses for courses, 2004, International Journal of Nursing Practice.

[27] D. Cox, et al. An Analysis of Transformations, 1964.

[28] Irma J. Terpenning, et al. STL: A Seasonal-Trend Decomposition Procedure Based on Loess, 1990.