Chapter 13 Bagging Binary and Quantile Predictors for Time Series: Further Issues

Bagging (bootstrap aggregating) is a smoothing method that improves predictive ability in the presence of parameter estimation uncertainty and model uncertainty. In Lee and Yang (2006), we examined how (equal-weighted and BMA-weighted) bagging works for one-step-ahead binary prediction with an asymmetric cost function for time series, considering simple cases with a particular lin-lin tick loss function and a particular algorithm for estimating a linear quantile regression model. In the present chapter, we examine how bagging predictors work with different aggregating (averaging) schemes, over multi-step forecast horizons, with a general class of tick loss functions, with different estimation algorithms, for nonlinear quantile regression models, and at different data frequencies. Bagging quantile predictors are constructed by (weighted) averaging over predictors trained on bootstrapped training samples, and bagging binary predictors by (majority) voting over predictors trained on bootstrapped training samples. We find that median bagging and trimmed-mean bagging can alleviate the problem of extreme predictors from bootstrap samples and perform better than equally weighted bagging predictors; that bagging works better at longer forecast horizons; and that bagging works well with highly nonlinear quantile regression models (e.g., artificial neural networks) and with general tick loss functions. We also find that the performance of bagging can be affected by the choice of quantile estimation algorithm (in small samples, even when each algorithm is consistent) and by the frequency of the time series data.
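The abstract's recipe — resample the training series, train a quantile predictor on each bootstrap sample, then aggregate the forecasts by mean, median, or trimmed mean and score with the tick (lin-lin) loss — can be sketched in a few lines. This is only an illustrative stand-in, not the chapter's actual procedure: the chapter bags regression-based quantile predictors (linear and neural-network quantile regressions), whereas the sketch below bags a simple unconditional sample-quantile forecast, and the function names, block length, and number of bootstrap replications are all hypothetical choices. The moving-block bootstrap is used so that resampling preserves the short-run dependence of the series.

```python
import numpy as np

def tick_loss(y, q, tau):
    """Lin-lin (pinball) loss: tau*(y-q) if y >= q, else (1-tau)*(q-y)."""
    e = y - q
    return np.mean(np.maximum(tau * e, (tau - 1) * e))

def moving_block_bootstrap(x, block_len, rng):
    """Resample the series by concatenating random contiguous blocks."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

def bagged_quantile_forecast(train, tau, B=200, block_len=8,
                             agg="median", trim=0.1, seed=0):
    """Bag a (constant) tau-quantile predictor over B block-bootstrap samples."""
    rng = np.random.default_rng(seed)
    preds = np.array([np.quantile(moving_block_bootstrap(train, block_len, rng), tau)
                      for _ in range(B)])
    if agg == "mean":          # equal-weighted bagging
        return preds.mean()
    if agg == "median":        # median bagging: robust to extreme bootstrap predictors
        return np.median(preds)
    if agg == "trimmed":       # trimmed-mean bagging: drop the tails, then average
        lo, hi = np.quantile(preds, [trim, 1 - trim])
        return preds[(preds >= lo) & (preds <= hi)].mean()
    raise ValueError(agg)

# Toy AR(1) series split into training and evaluation periods.
rng = np.random.default_rng(42)
y = np.zeros(600)
for t in range(1, 600):
    y[t] = 0.5 * y[t - 1] + rng.standard_normal()
train, test = y[:400], y[400:]

tau = 0.05
for agg in ("mean", "median", "trimmed"):
    q_hat = bagged_quantile_forecast(train, tau, agg=agg)
    print(agg, round(tick_loss(test, q_hat, tau), 4))
```

Median and trimmed-mean aggregation matter here for the reason the abstract gives: an extreme tail-quantile estimate from a single unlucky bootstrap sample can drag the equal-weighted average far off, while the median and trimmed mean simply discard it.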

[1]  V. Chernozhukov, et al.  An MCMC Approach to Classical Estimation, 2002.

[2]  David F. Hendry, et al.  Non-Parametric Direct Multi-Step Estimation for Forecasting Economic Processes, 2004.

[3]  Moshe Buchinsky,  Recent Advances in Quantile Regression Models: A Practical Guideline for Empirical Research, 1998.

[4]  J. B. G. Frenk, et al.  A deep cut ellipsoid algorithm for convex programming: Theory and applications, 1994, Math. Program.

[5]  C. Granger, et al.  Handbook of Economic Forecasting, 2006.

[6]  C. Granger, et al.  Economic and Statistical Measures of Forecast Accuracy, 1999.

[7]  Victor Chernozhukov, et al.  Conditional value-at-risk: Aspects of modeling and estimation, 2000.

[8]  C. Granger, et al.  Forecasting from non-linear models in practice, 1994.

[9]  Leo Breiman,  Bagging Predictors, 1996, Machine Learning.

[10]  J. Stock, et al.  A Comparison of Linear and Nonlinear Univariate Models for Forecasting Macroeconomic Time Series, 1998.

[11]  M. Wand, et al.  Exact Mean Integrated Squared Error, 1992.

[12]  Francis X. Diebold, et al.  Financial Asset Returns, Direction-of-Change Forecasting and Volatility Dynamics, 2003, Rodney L. White Center for Financial Research working paper.

[13]  Michael P. Clements, et al.  Forecasting Non-Stationary Economic Time Series, 1999.

[14]  Lutz Kilian, et al.  How Useful Is Bagging in Forecasting Economic Time Series? A Case Study of US CPI Inflation, 2005.

[15]  D. Madigan, et al.  Bayesian Model Averaging for Linear Regression Models, 1997.

[16]  Ruey S. Tsay, et al.  Co-integration constraint and forecasting: An empirical examination, 1996.

[17]  T.-H. Lee and Y. Yang,  Bagging binary and quantile predictors for time series, 2006.

[18]  Ruey S. Tsay, et al.  Comment: Adaptive Forecasting, 1993.

[19]  Q. Vuong, et al.  Efficient Conditional Quantile Estimation: The Time Series Case, 2006.

[20]  A. Timmermann,  Forecast Combinations (Chapter 4, Handbook of Economic Forecasting), 2006.

[21]  M. Hashem Pesaran, et al.  Selection of estimation window in the presence of breaks, 2007.

[22]  Todd E. Clark, et al.  Improving Forecast Accuracy by Combining Recursive and Rolling Forecasts, 2008.

[23]  A. Timmermann,  Forecast Combinations, 2005.

[24]  J. Powell, et al.  Censored regression quantiles, 1986.

[25]  R. Koenker, et al.  An interior point algorithm for nonlinear quantile regression, 1996.

[26]  R. Koenker, et al.  Asymptotic Theory of Least Absolute Error Regression, 1978.

[27]  R. Koenker, et al.  The Gaussian hare and the Laplacian tortoise: computability of squared-error versus absolute-error estimators, 1997.

[28]  R. Engle, et al.  CAViaR: Conditional Autoregressive Value at Risk by Regression Quantiles, 1999.

[29]  H. Chipman, et al.  Bayesian CART Model Search, 1998.

[30]  Ivana Komunjer, et al.  Quasi-maximum likelihood estimation for conditional quantiles, 2005.

[31]  Herbert K. H. Lee,  Consistency of posterior distributions for neural networks, 2000, Neural Networks.

[32]  Bernd Fitzenberger, et al.  The moving blocks bootstrap and robust inference for linear least squares and quantile regressions, 1998.

[33]  R. Mariano, et al.  Residual-Based Procedures for Prediction and Estimation in a Nonlinear Simultaneous System, 1984.

[34]  H. White,  Nonparametric Estimation of Conditional Quantiles Using Neural Networks, 1990.

[35]  Mark W. Watson, et al.  An Empirical Comparison of Methods for Forecasting Using Many Predictors, 2005.