Data compression with low distortion and finite blocklength

This paper considers lossy source coding of n-dimensional continuous memoryless sources under low mean-square error distortion and gives a simple, explicit approximation to the minimum source coding rate. More precisely, a nonasymptotic version of Shannon's lower bound is presented. Lattice quantizers are shown to approach that lower bound provided that the source density is smooth enough and the distortion is low, which implies that fine multidimensional lattice coverings are nearly optimal in the rate-distortion sense even at finite blocklength n. The achievability proof avoids both the usual random coding argument and the simplifying assumption of a dither signal.
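For intuition, here is a minimal numerical sketch (not the paper's construction) of how a lattice quantizer's rate compares against the classical Shannon lower bound R(d) >= h(X) - (1/2) log2(2*pi*e*d) bits per sample for mean-square error distortion d. It quantizes a unit-variance Gaussian source with the scaled cubic lattice delta*Z, a deliberately simple one-dimensional lattice; the step size delta and sample count are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: quantize an i.i.d. Gaussian source with the scaled
# cubic lattice delta * Z and compare the empirical entropy rate of the
# quantizer output against the Shannon lower bound
#   R(d) >= h(X) - 0.5 * log2(2*pi*e*d)   [bits per sample, MSE d].
# delta and n_samples are illustrative choices, not values from the paper.

rng = np.random.default_rng(0)
n_samples = 1_000_000
delta = 0.05                               # lattice cell width (assumed)

x = rng.standard_normal(n_samples)         # unit-variance Gaussian source
x_hat = delta * np.round(x / delta)        # nearest point of delta * Z

d = np.mean((x - x_hat) ** 2)              # per-sample MSE, ~ delta^2 / 12

# Empirical entropy of the quantizer output (bits per sample).
_, counts = np.unique(x_hat, return_counts=True)
p = counts / counts.sum()
rate = -np.sum(p * np.log2(p))

# Shannon lower bound at the achieved distortion; for a standard
# Gaussian, h(X) = 0.5 * log2(2*pi*e), so the bound reduces to
# 0.5 * log2(1 / d).
slb = 0.5 * np.log2(2 * np.pi * np.e) - 0.5 * np.log2(2 * np.pi * np.e * d)

print(f"distortion d   = {d:.6f}")
print(f"quantizer rate = {rate:.3f} bits/sample")
print(f"Shannon bound  = {slb:.3f} bits/sample")
print(f"gap            = {rate - slb:.3f} bits/sample")
```

At small delta the gap settles near (1/2) log2(2*pi*e/12), roughly 0.25 bits per sample, the familiar high-resolution penalty of the cubic lattice; finer lattice coverings in higher dimensions shrink this gap, which is the regime the paper quantifies nonasymptotically.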
