Noisy Time Series Prediction using Recurrent Neural Networks and Grammatical Inference

Financial forecasting is an example of a signal processing problem that is challenging due to small sample sizes, high noise, non-stationarity, and non-linearity. Neural networks have been very successful in a number of signal processing applications. We discuss fundamental limitations and inherent difficulties when using neural networks to process high-noise, small-sample-size signals, and we introduce a new intelligent signal processing method that addresses these difficulties. The proposed method converts the signal into a symbolic representation with a self-organizing map and performs grammatical inference with recurrent neural networks. We apply the method to the prediction of daily foreign exchange rates, addressing difficulties with non-stationarity, overfitting, and unequal a priori class probabilities, and we find significant predictability in comprehensive experiments covering 5 different foreign exchange rates. The method correctly predicts the direction of change for the next day with an error rate of 47.1%. The error rate falls to around 40% when examples for which the system has low confidence in its prediction are rejected. We show that the symbolic representation aids the extraction of symbolic knowledge from the trained recurrent neural networks in the form of deterministic finite state automata. These automata explain the operation of the system and are often relatively simple. Automata rules related to well-known behavior such as trend following and mean reversal are extracted.
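The pipeline described above can be sketched roughly as follows: difference the price series into returns, quantize the returns into a small symbolic alphabet with a one-dimensional self-organizing map, and train a recurrent network on the symbol sequence to predict the direction of the next day's change. The Python sketch below is only a minimal illustration of that idea, not the authors' implementation: the alphabet size, network dimensions, and synthetic data are arbitrary placeholder choices, and for brevity only the readout weights of the recurrent network are trained (a reservoir-style simplification), whereas the paper trains the full recurrent network and then extracts finite state automata from it.

import numpy as np

rng = np.random.default_rng(0)

# Toy daily price series (placeholder for a real foreign exchange series)
prices = np.cumsum(rng.normal(0, 1, 1000)) + 100.0
returns = np.diff(np.log(prices))                        # daily log returns

# 1-D self-organizing map: quantize returns into k symbols (k is an assumption)
k = 5
nodes = np.linspace(returns.min(), returns.max(), k)     # initial codebook
for t, x in enumerate(returns):
    lr = 0.1 * (1 - t / len(returns))                    # decaying learning rate
    w = np.argmin(np.abs(nodes - x))                     # winning node
    for j in range(k):                                   # neighborhood update
        h = np.exp(-abs(j - w))
        nodes[j] += lr * h * (x - nodes[j])
symbols = np.argmin(np.abs(returns[:, None] - nodes[None, :]), axis=1)

# Elman-style recurrent network over the symbol sequence; only the logistic
# readout is trained here, as a simplification of full recurrent training.
n_hidden = 8
W_in = rng.normal(0, 0.5, (n_hidden, k))                 # one-hot symbol -> hidden
W_rec = rng.normal(0, 0.5, (n_hidden, n_hidden)) * 0.5   # hidden -> hidden
w_out = np.zeros(n_hidden)                               # hidden -> P(next day up)
b_out = 0.0

def one_hot(s):
    v = np.zeros(k)
    v[s] = 1.0
    return v

targets = (returns[1:] > 0).astype(float)                # direction of next day's change
h = np.zeros(n_hidden)
eta = 0.05
correct = 0
for s, y in zip(symbols[:-1], targets):
    h = np.tanh(W_in @ one_hot(s) + W_rec @ h)           # recurrent state update
    p = 1.0 / (1.0 + np.exp(-(w_out @ h + b_out)))       # predicted P(up)
    correct += int((p > 0.5) == bool(y))
    grad = p - y                                         # logistic-loss gradient
    w_out -= eta * grad * h
    b_out -= eta * grad

print(f"in-sample direction accuracy: {correct / len(targets):.3f}")

In the same spirit, low-confidence rejection would correspond to abstaining whenever p is close to 0.5, and the symbolic (discrete) input representation is what makes it natural to read the trained recurrent network as a deterministic finite state automaton over the SOM alphabet.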
