Source coding exponents for zero-delay coding with finite memory

Fundamental limits on the source coding exponents (or large deviations performance) of zero-delay finite-memory (ZDFM) lossy source codes are studied. Our main results are the following. For any memoryless source, a suitably designed encoder that time-shares (at most two) memoryless scalar quantizers is as good as any time-varying fixed-rate ZDFM code, in that it can achieve the fastest exponential rate of decay for the probability of excess distortion. A dual result is shown to apply to the probability of excess code length, among all fixed-distortion ZDFM codes with variable rate. Finally, it is shown that if the scope is broadened to ZDFM codes with variable rate and variable distortion, then a time-invariant entropy-coded memoryless quantizer (without time sharing) is asymptotically optimal under a "fixed-slope" large-deviations criterion (introduced and motivated here in detail) corresponding to a linear combination of the code length and the distortion. These results also lead to single-letter characterizations for the source coding error exponents of ZDFM codes.
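The time-sharing idea above can be illustrated with a toy Monte Carlo sketch: a memoryless uniform source is encoded by switching between two midpoint uniform scalar quantizers, and the probability of excess average distortion over a block is estimated empirically. This is only an illustration of the concepts (time-sharing, excess-distortion events), not the paper's construction; the quantizers, source, time-sharing fraction, and helper names here are all hypothetical choices.

```python
import random

def uniform_quantizer(x, levels):
    # Midpoint uniform scalar quantizer on [0, 1) with `levels` cells.
    cell = min(int(x * levels), levels - 1)
    return (cell + 0.5) / levels

def timeshare_encode(source, q_a, q_b, frac):
    # Time-share two memoryless scalar quantizers: q_a on the first
    # frac-fraction of the block, q_b on the remainder.
    k = int(frac * len(source))
    return [q_a(x) for x in source[:k]] + [q_b(x) for x in source[k:]]

def excess_distortion_prob(n, trials, threshold, frac, rng):
    # Monte Carlo estimate of P( (1/n) * sum d(X_i, Xhat_i) > threshold )
    # for an i.i.d. Uniform[0,1) source under squared-error distortion.
    count = 0
    for _ in range(trials):
        src = [rng.random() for _ in range(n)]
        rec = timeshare_encode(src,
                               lambda x: uniform_quantizer(x, 4),   # 2-bit quantizer
                               lambda x: uniform_quantizer(x, 2),   # 1-bit quantizer
                               frac)
        avg_d = sum((x - y) ** 2 for x, y in zip(src, rec)) / n
        if avg_d > threshold:
            count += 1
    return count / trials

if __name__ == "__main__":
    rng = random.Random(0)
    # With time-sharing fraction 1/2, the expected per-symbol MSE is about
    # (1/192 + 1/48) / 2, so a threshold of 0.05 makes excess distortion rare.
    print(excess_distortion_prob(n=50, trials=500, threshold=0.05,
                                 frac=0.5, rng=rng))
```

In the large-deviations framing of the paper, one would study how fast this excess-distortion probability decays exponentially in the block length `n`; the sketch only exposes the event whose exponent is being optimized.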
