Optimizing the quality of bootstrap-based prediction intervals

The bootstrap method is one of the most widely used methods in the literature for constructing confidence and prediction intervals. This paper proposes a new method for improving the quality of bootstrap-based prediction intervals. At its core is a prediction interval-based cost function used for training neural networks. A simulated annealing method is applied to minimize this cost function and adjust the neural network parameters. The trained neural networks are then used to estimate the target variance. Experiments and simulations show that the proposed method yields better-quality bootstrap-based prediction intervals: the optimized intervals are narrower and achieve a higher coverage probability than traditional bootstrap-based prediction intervals.
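The sketch below is a minimal, hedged illustration of the pipeline described above, not the authors' implementation: a bootstrap ensemble of networks supplies the point prediction and model-uncertainty variance, and a small noise-variance network is tuned by simulated annealing so as to minimize a prediction interval-based cost. The synthetic data, the coverage/width form of the cost, the penalty weight, the network sizes, and the use of SciPy's `dual_annealing` as a stand-in for the paper's simulated annealing scheme are all assumptions made for illustration.

```python
"""Illustrative sketch (not the authors' code): bootstrap-based prediction
intervals whose noise-variance network is tuned by simulated annealing to
minimize a prediction interval-based cost. Cost form and hyper-parameters
are assumptions."""
import numpy as np
from scipy.optimize import dual_annealing
from scipy.stats import norm
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic heteroscedastic regression problem (assumed example).
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1 + 0.1 * np.abs(X[:, 0]))

# Bootstrap ensemble: point prediction and model-uncertainty variance.
B = 20
ensemble = []
for b in range(B):
    idx = rng.integers(0, len(X), len(X))              # bootstrap resample
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                       random_state=b).fit(X[idx], y[idx])
    ensemble.append(net)

preds = np.column_stack([net.predict(X) for net in ensemble])
y_hat = preds.mean(axis=1)                              # point prediction
var_model = preds.var(axis=1)                           # model uncertainty

# Small noise-variance network; its weights are the annealing variables.
H = 5                                                   # hidden units (assumed)

def noise_var(w, X):
    """Single-hidden-layer net mapping x -> noise variance (softplus output)."""
    W1, b1 = w[:H].reshape(1, H), w[H:2 * H]
    W2, b2 = w[2 * H:3 * H], w[-1]
    h = np.tanh(X @ W1 + b1)
    return np.log1p(np.exp(h @ W2 + b2))                # keep variance positive

alpha = 0.05
z = norm.ppf(1 - alpha / 2)

def pi_cost(w):
    """PI-based cost: normalized mean width plus a coverage-shortfall penalty."""
    half = z * np.sqrt(var_model + noise_var(w, X))
    covered = np.mean((y >= y_hat - half) & (y <= y_hat + half))
    width = np.mean(2 * half) / (y.max() - y.min())
    return width + 50.0 * max(0.0, (1 - alpha) - covered)   # assumed trade-off

# Generic simulated annealing over the variance-net weights.
bounds = [(-5, 5)] * (3 * H + 1)
res = dual_annealing(pi_cost, bounds, seed=0, maxiter=200)

half = z * np.sqrt(var_model + noise_var(res.x, X))
lower, upper = y_hat - half, y_hat + half
print("coverage:", np.mean((y >= lower) & (y <= upper)),
      "mean width:", np.mean(upper - lower))
```

In this toy setup, a classical bootstrap interval would set the noise variance by a fixed residual-based estimate; replacing that step with a variance network chosen to directly minimize the coverage/width cost is what allows the intervals to narrow without losing coverage.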
