We present a technique to determine the embedding dimension of deterministic time series. The technique is based on the assumption that if a continuous deterministic time series with a continuous first derivative and underlying model $f_S(x_{t-1}, \ldots, x_{t-m}) = x_t$ is represented in its appropriate embedding dimension $\mathbb{R}^m$, then any two neighbouring input observations $x_m^i$ and $x_m^j$ should correspond to similar output observations $x_t^i$ and $x_t^j$. The ratio between the input and output distances of neighbouring observations is therefore used as the criterion for determining the embedding dimension of the series. The performance of the technique is illustrated and compared on synthetic time series (the logistic map, the Hénon map and the Mackey-Glass function), as well as on the Laser data from the Santa Fe competition. Applied to the Laser data in combination with a Limited Resource Allocating Neural Network (LRAN), our technique proved as successful on the 100-step prediction of this chaotic time series as the techniques proposed by the winners of the competition.
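As an illustration of the criterion described above, the following Python sketch embeds a scalar series at candidate dimensions m = 1, ..., max_dim, finds each embedded vector's nearest neighbour in input space, and averages the ratio between the output distance and the input distance of each neighbouring pair. The function name `embedding_criterion`, the single-nearest-neighbour choice, and the use of the mean ratio as the summary statistic are assumptions made for illustration only; the paper's exact statistic may differ.

```python
import numpy as np

def embedding_criterion(series, max_dim=10):
    """Score candidate embedding dimensions by comparing input- and
    output-space distances of nearest neighbours (illustrative sketch)."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    scores = {}
    for m in range(1, max_dim + 1):
        # Embed: inputs x_m(t) = (x_{t-1}, ..., x_{t-m}), outputs x_t.
        X = np.array([series[t - m:t] for t in range(m, n)])
        y = series[m:n]
        ratios = []
        for i in range(len(X)):
            d_in = np.linalg.norm(X - X[i], axis=1)
            d_in[i] = np.inf                  # exclude the point itself
            j = int(np.argmin(d_in))          # nearest neighbour in input space
            if d_in[j] > 0 and np.isfinite(d_in[j]):
                ratios.append(abs(y[i] - y[j]) / d_in[j])
        # Summary statistic: mean output/input distance ratio for dimension m.
        scores[m] = float(np.mean(ratios)) if ratios else float("nan")
    return scores

# Example: Hénon map in scalar form (x_t depends on the two previous values);
# the ratio statistic is expected to drop between m = 1 and m = 2 and then
# level off, suggesting an embedding dimension of 2.
x = [0.1, 0.1]
for _ in range(498):
    x.append(1.0 - 1.4 * x[-1] ** 2 + 0.3 * x[-2])
print(embedding_criterion(x, max_dim=5))
```

Under these assumptions, the smallest m at which the statistic levels off would be taken as the embedding dimension of the series.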