Abstract:

The concepts of entropy and dimension as applied to dynamical systems are reviewed from a physical point of view. The information dimension, which measures the rate at which the information contained in a probability density scales with resolution, fills a logical gap in the classification of attractors in terms of metric entropy, fractal dimension, and topological entropy. Several examples are presented of chaotic attractors that have a self-similar, geometrically scaling structure in their probability distribution; for these attractors the information dimension and fractal dimension are different. Just as the metric (Kolmogorov-Sinai) entropy places an upper bound on the information gained in a sequence of measurements, the information dimension can be used to estimate the information obtained in an isolated measurement. The metric entropy can be expressed in terms of the information dimension of a probability distribution constructed from a sequence of measurements. An algorithm is presented that allows the experimental determination of the information dimension and metric entropy.
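To make the scaling idea concrete: the information dimension is commonly defined as D_I = lim_{eps -> 0} H(eps) / log(1/eps), where H(eps) = -sum_i P_i log P_i is the entropy of the attractor's probability distribution coarse-grained into boxes of side eps. The Python sketch below is an illustrative box-counting estimate of D_I from a long orbit; the function names are hypothetical and it is not the specific algorithm presented in the paper.

import numpy as np

def information_dimension(points, epsilons):
    """Estimate the information dimension of a point cloud by box-counting.

    For each box size eps, partition space into boxes of side eps, estimate
    the probability P_i of each occupied box from visit frequencies, and
    compute the entropy H(eps) = -sum P_i log P_i.  The information dimension
    is the slope of H(eps) versus log(1/eps).
    """
    entropies = []
    for eps in epsilons:
        # Assign each point to an integer box index.
        boxes = np.floor(points / eps).astype(int)
        _, counts = np.unique(boxes, axis=0, return_counts=True)
        p = counts / counts.sum()
        entropies.append(-np.sum(p * np.log(p)))
    # Fit H(eps) ~ D_I * log(1/eps) + const; the slope estimates D_I.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(epsilons)), entropies, 1)
    return slope

def henon_orbit(n, a=1.4, b=0.3):
    """Generate an orbit of the Henon map, a standard chaotic attractor."""
    x, y = 0.1, 0.1
    orbit = np.empty((n, 2))
    for i in range(n):
        x, y = 1.0 - a * x * x + b * y, x
        orbit[i] = (x, y)
    return orbit

if __name__ == "__main__":
    pts = henon_orbit(100_000)[1000:]      # discard the initial transient
    eps = np.logspace(-1, -3, 8)           # range of box sizes
    print("estimated information dimension:", information_dimension(pts, eps))

In practice the slope should be read off over an intermediate range of eps: at very coarse resolution the boxes do not resolve the attractor, and at very fine resolution the finite orbit length under-samples the boxes.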

Citation:

Farmer, J. D. (1982). 'Information Dimension and the Probabilistic Structure of Chaos.' Z. Naturforsch. 37A, pp. 1304-1325.
