Discovering Time-Series Building Blocks Using the ‘Significance Engine’

This paper presents a time-series prediction framework called the “Significance Engine”, which contributes a novel, noise-resistant mechanism that combines Takens' theorem with dynamic self-organising maps. It is built on the idea that a time series can be decomposed into random and non-random components, and that the non-random components can help predict future movement. The system is highly parallel, combining multiple dynamic self-organising map units of the ‘Grow When Required’ type with time-delay embedding techniques. It learns iteratively: the more data it is fed from the input distribution, the better its predictions become. Once the framework has been initialised and, ideally, primed with historical data, it can recognise recurring patterns in highly noisy waveforms and make future predictions based on what occurred historically when these motifs were previously observed.
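The time-delay embedding step the abstract refers to (via Takens' theorem) can be sketched as follows. This is a generic illustration, not the paper's implementation: the function name `delay_embed` and the parameter choices (`dim`, `tau`) are assumptions for the example. It maps a scalar series into vectors x_t = (s_t, s_{t+τ}, …, s_{t+(m−1)τ}), which are what a self-organising map would then quantise into recurring motifs.

```python
import numpy as np

def delay_embed(series, dim=3, tau=2):
    """Time-delay embedding in the spirit of Takens' theorem:
    each row is (s_t, s_{t+tau}, ..., s_{t+(dim-1)*tau})."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (dim - 1) * tau  # number of complete delay vectors
    if n <= 0:
        raise ValueError("series too short for this dim/tau combination")
    # Stack shifted views of the series as columns of the embedding matrix.
    return np.stack([series[i * tau : i * tau + n] for i in range(dim)], axis=1)

# Example: embed a noisy sine wave into 3-dimensional delay space.
t = np.linspace(0, 4 * np.pi, 200)
rng = np.random.default_rng(0)
s = np.sin(t) + 0.1 * rng.normal(size=t.size)
X = delay_embed(s, dim=3, tau=5)
print(X.shape)  # (190, 3)
```

Each row of `X` is one point on the reconstructed attractor; a growing map such as GWR would place and adapt nodes over these points, so that revisited regions of delay space correspond to the recurring motifs the abstract describes.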
