Simulation of random processes and rate-distortion theory

We study the randomness necessary for the simulation of a random process with given distributions, in terms of the finite-precision resolvability of the process. Finite-precision resolvability is defined as the minimal random-bit rate required by the simulator, as a function of the accuracy with which the distributions are replicated. The accuracy is quantified by means of various measures: variational distance, divergence, Ornstein (1973), Prokhorov (1956), and related measures of distance between the distributions of random processes. In the case of Ornstein, Prokhorov, and other distances of the Kantorovich-Vasershtein type, we show that the finite-precision resolvability is equal to the rate-distortion function with a fidelity criterion derived from the accuracy measure. This connection leads to new results in nonstationary rate-distortion theory. In the case of variational distance, the resolvability of stationary ergodic processes is shown to equal the entropy rate regardless of the allowed accuracy. In the case of normalized divergence, explicit expressions for finite-precision resolvability are obtained in many cases of interest, and connections with data compression with minimum probability of block error are established.
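For concreteness, the central definition and the two headline identities can be sketched in symbols; the notation below ($S_\rho(\Delta)$, $f_n$, $U_k$, $R(\Delta)$, $H$) is our own shorthand, not fixed by the abstract, so this is a hedged sketch of the setup rather than the paper's exact statement. Let $X = (X_1, X_2, \ldots)$ be the target process, $U_k$ a string of $k$ fair random bits, $f_n$ the simulator map, and $\rho$ the accuracy measure. The finite-precision resolvability at accuracy $\Delta$ is then

$$ S_{\rho}(\Delta) \;=\; \inf \Bigl\{ R \,:\, \exists\, f_n \ \text{with}\ \limsup_{n \to \infty} \rho\bigl(P_{X^n},\, P_{f_n(U_{\lceil nR \rceil})}\bigr) \le \Delta \Bigr\}. $$

In this notation, the results summarized above read $S_{\bar d}(\Delta) = R(\Delta)$, the rate-distortion function under the fidelity criterion induced by the Ornstein/Prokhorov-type distance, while for stationary ergodic processes under variational distance, $S_{v}(\Delta) = H$, the entropy rate, for every admissible accuracy $\Delta$.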

[1]  Robert M. Gray, et al., "Source coding theorems without the ergodic assumption," IEEE Trans. Inf. Theory, 1974.

[2]  F. Hampel, "A General Qualitative Definition of Robustness," 1971.

[3]  I. Csiszár, Information Theory, 1981.

[4]  Sergio Verdú, et al., "Channel simulation and coding with side information," IEEE Trans. Inf. Theory, 1994.

[5]  I. Vajda, Theory of Statistical Inference and Information, 1989.

[6]  John C. Kieffer, "Strong converses in source coding relative to a fidelity criterion," IEEE Trans. Inf. Theory, 1991.

[7]  Sergio Verdú, et al., "Approximation theory of output statistics," IEEE Trans. Inf. Theory, 1993.

[8]  R. Gray, et al., "Robustness of Estimators on Stationary Observations," 1979.

[9]  John C. Kieffer, "On the optimum average distortion attainable by fixed-rate coding of a nonergodic source," IEEE Trans. Inf. Theory, 1975.

[10]  R. Gray, et al., "Block coding for discrete stationary $\bar d$-continuous noisy channels," IEEE Trans. Inf. Theory, 1979.

[11]  L. Davisson, et al., "The Distortion-Rate Function for Nonergodic Sources," 1978.

[12]  D. A. Edwards, "On the existence of probability measures with given marginals," 1978.

[13]  Yu. V. Prokhorov, "Convergence of Random Processes and Limit Theorems in Probability Theory," 1956.

[14]  D. Ornstein, "An Application of Ergodic Theory to Probability Theory," 1973.

[15]  R. Gray, et al., "A Generalization of Ornstein's $\bar d$ Distance with Applications to Information Theory," 1975.

[16]  R. Gray, Entropy and Information Theory, Springer, New York, 1990.