The concepts of entropy and dimension as applied to dynamical systems are reviewed from a physical point of view. The information dimension, which measures the rate at which the information contained in a probability density scales with resolution, fills a logical gap in the classification of attractors in terms of metric entropy, fractal dimension, and topological entropy. Several examples are presented of chaotic attractors that have a self-similar, geometrically scaling structure in their probability distribution; for these attractors the information dimension and fractal dimension differ. Just as the metric (Kolmogorov-Sinai) entropy places an upper bound on the information gained in a sequence of measurements, the information dimension can be used to estimate the information obtained in an isolated measurement. The metric entropy can be expressed in terms of the information dimension of a probability distribution constructed from a sequence of measurements. An algorithm is presented that allows the experimental determination of the information dimension and metric entropy.
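The scaling relation described above, in which the information dimension is the rate at which the coarse-grained information I(ε) = −Σ pᵢ ln pᵢ grows with ln(1/ε), can be illustrated with a minimal box-counting sketch. The following Python code is an assumed, illustrative implementation (not the algorithm presented in the paper): it generates points on the Hénon attractor, coarse-grains them onto boxes of side ε, and estimates the information dimension as the least-squares slope of I(ε) versus ln(1/ε). The Hénon map, parameter values, and box sizes are choices made for illustration.

```python
import math
from collections import Counter

def henon_orbit(n, a=1.4, b=0.3, discard=1000):
    """Iterate the Henon map; return n (x, y) points after discarding transients."""
    x, y = 0.1, 0.1
    pts = []
    for i in range(n + discard):
        x, y = 1.0 - a * x * x + y, b * x
        if i >= discard:
            pts.append((x, y))
    return pts

def information(points, eps):
    """Coarse-grain onto boxes of side eps; return I(eps) = -sum p_i ln p_i (nats)."""
    counts = Counter((math.floor(x / eps), math.floor(y / eps)) for x, y in points)
    n = len(points)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def information_dimension(points, epsilons):
    """Estimate D1 as the least-squares slope of I(eps) versus ln(1/eps)."""
    xs = [math.log(1.0 / e) for e in epsilons]
    ys = [information(points, e) for e in epsilons]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((u - mx) * (v - my) for u, v in zip(xs, ys))
            / sum((u - mx) ** 2 for u in xs))

pts = henon_orbit(20000)
d1 = information_dimension(pts, [0.05, 0.02, 0.01])
```

With enough points and a range of ε spanning the scaling region, the slope settles near the accepted Hénon value of roughly 1.25; at too-small ε the finite sample exhausts the boxes and I(ε) saturates at ln N, so the range of box sizes must be chosen with the sample size in mind.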