Bootstrapping Neural Networks

Knowledge about the distribution of a statistical estimator is important for various purposes, such as constructing confidence intervals for model parameters or determining critical values of tests. A widely used method for estimating this distribution is the bootstrap, which imitates the probabilistic structure of the data-generating process using the information contained in a given set of random observations. In this article we investigate this classical method in the context of artificial neural networks used to estimate a mapping from input to output space. We establish consistency results for bootstrap estimates of the distribution of parameter estimates.
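
To make the procedure concrete, the following is a minimal sketch of a residual bootstrap for a single hidden-layer feedforward network, in the spirit of the method studied here. Everything in it is an illustrative assumption rather than the authors' construction: the tiny numpy network and its gradient-descent fit, the hyperparameters, and the choice to track a fitted value of the network instead of raw weights.

```python
# A minimal, self-contained sketch (numpy only) of the residual bootstrap
# for a single hidden-layer network. All names and settings are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def net(x, W, b, v, c):
    # f(x) = v . tanh(W x + b) + c, evaluated for a vector of inputs x
    return np.tanh(np.outer(x, W) + b) @ v + c

def fit(x, y, hidden=3, steps=2000, lr=0.05):
    # Least-squares fit by plain gradient descent (illustrative only).
    W, b, v = (rng.normal(size=hidden) for _ in range(3))
    c, n = 0.0, len(x)
    for _ in range(steps):
        h = np.tanh(np.outer(x, W) + b)    # n x hidden activations
        r = h @ v + c - y                  # residuals of current fit
        g = (1.0 - h**2) * v               # backprop through tanh
        W -= lr * (r @ (g * x[:, None])) / n
        b -= lr * (r @ g) / n
        v -= lr * (r @ h) / n
        c -= lr * r.mean()
    return W, b, v, c

# Simulated data: smooth regression function plus i.i.d. noise.
n = 200
x = rng.uniform(-2.0, 2.0, n)
y = np.sin(2.0 * x) + 0.2 * rng.normal(size=n)

theta_hat = fit(x, y)
resid = y - net(x, *theta_hat)
resid -= resid.mean()                      # centre residuals before resampling

# Residual bootstrap: resample residuals, rebuild responses, refit,
# and record a functional of the refitted network.
B = 50
boot = np.empty(B)
for i in range(B):
    y_star = net(x, *theta_hat) + rng.choice(resid, size=n, replace=True)
    boot[i] = net(np.array([0.5]), *fit(x, y_star))[0]

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% percentile bootstrap interval for f(0.5): [{lo:.3f}, {hi:.3f}]")
```

Tracking a functional of the network rather than the raw weight vector sidesteps the fact that hidden-unit permutations and sign flips leave the fitted function unchanged, so individual weights are identified only up to these symmetries; consistency results for parameter estimates of this kind typically require identifiability conditions.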
