Mutual information rate, distortion, and quantization in metric spaces

Several new properties, as well as simplified proofs of known properties, are developed for the mutual information rate between discrete-time random processes whose alphabets are Borel subsets of complete separable metric spaces. In particular, the asymptotic properties of quantizers for such spaces provide a link with finite-alphabet processes and yield the ergodic decomposition of mutual information rate. This result is used to prove the equality of the stationary and ergodic process distortion-rate functions with the usual distortion-rate function. An unusual definition of mutual information rate for continuous-alphabet processes is used, but it is shown to be operationally appropriate and mathematically more useful: it provides an intuitive link between continuous-alphabet and finite-alphabet processes, and it allows generalizations of some fundamental results of ergodic theory that are useful for information theory.
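The link between continuous-alphabet and finite-alphabet processes described above can be illustrated numerically. The following sketch (not from the paper; the uniform quantizer, the bivariate Gaussian source, and all parameter values are illustrative assumptions) quantizes two correlated continuous-valued sequences with successively finer partitions and computes the plug-in mutual information of the quantized pair. As the partition refines, the finite-alphabet mutual information increases toward the continuous mutual information, which for a bivariate Gaussian with correlation rho has the closed form -0.5 ln(1 - rho^2).

```python
import math
import random
from collections import Counter

def quantize(xs, num_bins, lo, hi):
    """Uniform scalar quantizer: map each real x to a bin index in [0, num_bins)."""
    width = (hi - lo) / num_bins
    return [min(num_bins - 1, max(0, int((x - lo) / width))) for x in xs]

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in nats from paired finite-alphabet samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (a, b), count in pxy.items():
        p = count / n
        # p * log( p / (p_x * p_y) ), with marginals estimated from counts
        mi += p * math.log(p * n * n / (px[a] * py[b]))
    return mi

# Illustrative source: X standard Gaussian, Y = rho*X + noise (correlation rho).
random.seed(0)
rho = 0.9
n = 100_000
x = [random.gauss(0.0, 1.0) for _ in range(n)]
y = [rho * xi + math.sqrt(1 - rho**2) * random.gauss(0.0, 1.0) for xi in x]

true_mi = -0.5 * math.log(1 - rho**2)  # continuous MI for the Gaussian pair

estimates = {}
for bins in (2, 8, 32):
    qx = quantize(x, bins, -4.0, 4.0)
    qy = quantize(y, bins, -4.0, 4.0)
    estimates[bins] = mutual_information(qx, qy)
    print(f"{bins:3d} bins: I(qX;qY) ~ {estimates[bins]:.3f} nats "
          f"(continuous MI = {true_mi:.3f})")
```

Quantization can only destroy information (data processing), so each finite-alphabet estimate lies below the continuous mutual information, and refining the partition recovers it in the limit — the asymptotic quantizer property the abstract refers to.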