Information Dimension and the Probabilistic Structure of Chaos

The concepts of entropy and dimension as applied to dynamical systems are reviewed from a physical point of view. The information dimension, which measures the rate at which the information contained in a probability density scales with resolution, fills a logical gap in the classification of attractors in terms of metric entropy, fractal dimension, and topological entropy. Several examples are presented of chaotic attractors that have a self-similar, geometrically scaling structure in their probability distribution; for these attractors the information dimension and the fractal dimension differ. Just as the metric (Kolmogorov-Sinai) entropy places an upper bound on the information gained in a sequence of measurements, the information dimension can be used to estimate the information obtained in an isolated measurement. The metric entropy can be expressed in terms of the information dimension of a probability distribution constructed from a sequence of measurements. An algorithm is presented that allows the experimental determination of the information dimension and the metric entropy.
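
For concreteness, the scaling relation behind this definition can be written explicitly (a standard formulation consistent with the description above, not quoted from the text; the box size \epsilon and box probabilities p_i are introduced here only for illustration). Coarse-graining the probability density over boxes of size \epsilon gives the information

    I(\epsilon) = -\sum_i p_i(\epsilon) \log p_i(\epsilon),

and the information dimension is the rate at which this information grows as the resolution is refined,

    d_I = \lim_{\epsilon \to 0} \frac{I(\epsilon)}{\log(1/\epsilon)}.

By contrast, the fractal (capacity) dimension uses only the number N(\epsilon) of nonempty boxes, d_F = \lim_{\epsilon \to 0} \log N(\epsilon) / \log(1/\epsilon), so that in general d_I \le d_F, with equality when the probability is distributed uniformly over the attractor.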