Statistical inference, the bootstrap, and neural-network modeling with application to foreign exchange rates

We propose tests for individual and joint irrelevance of network inputs. Such tests can be used to determine whether an input or group of inputs "belong" in a particular model, thus permitting valid statistical inference based on estimated feedforward neural-network models. The approaches employ well-known statistical resampling techniques. We conduct a small Monte Carlo experiment showing that our tests have reasonable level and power behavior, and we apply our methods to examine whether there are predictable regularities in foreign exchange rates. We find that exchange rates do appear to contain information that is exploitable for enhanced point prediction, but the nature of the predictive relations evolves through time.
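The core idea can be sketched briefly: measure an input's contribution to model fit, then use resampling to judge whether that contribution could plausibly have arisen by chance. The following Python sketch is a hypothetical illustration only, not the test proposed in the paper; it pairs a crude "replace the input with its mean" relevance statistic for an sklearn MLPRegressor with a pairs bootstrap to approximate the statistic's sampling distribution.

```python
# Minimal, hypothetical sketch of a bootstrap test for input irrelevance in a
# feedforward network. Not the paper's exact procedure: it only illustrates
# refitting on resampled data and examining the distribution of a statistic
# that measures one input's contribution to fit.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic data: y depends on x0 and x1, while x2 is irrelevant by construction.
n = 400
X = rng.normal(size=(n, 3))
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)

def relevance_stat(X, y, j, seed=0):
    """Increase in in-sample MSE when input j is replaced by its sample mean."""
    net = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=seed)
    net.fit(X, y)
    mse_full = np.mean((y - net.predict(X)) ** 2)
    X_null = X.copy()
    X_null[:, j] = X[:, j].mean()          # "switch off" input j
    mse_null = np.mean((y - net.predict(X_null)) ** 2)
    return mse_null - mse_full

j = 2                                       # candidate irrelevant input
t_obs = relevance_stat(X, y, j)

# Pairs bootstrap: resample (x, y) pairs with replacement, recompute the statistic.
B = 200
t_boot = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, size=n)
    t_boot[b] = relevance_stat(X[idx], y[idx], j, seed=b)

# Crude one-sided p-value: how often the centered bootstrap statistic exceeds t_obs.
p_value = np.mean(t_boot - t_boot.mean() >= t_obs)
print(f"observed statistic = {t_obs:.4f}, bootstrap p-value ~ {p_value:.3f}")
```

A large p-value here is consistent with the input being irrelevant; the paper's tests extend this logic to valid inference on individual inputs and on groups of inputs jointly.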
