Rate-Distortion Bounds for High-Resolution Vector Quantization via Gibbs's Inequality

Gibbs's inequality states that the differential entropy of a random variable with probability density function (pdf) $f$ is less than or equal to its cross entropy with any other pdf $g$ defined on the same alphabet, i.e., $h(X)\leq -\mathsf{E}[\log g(X)]$. Using this inequality with a cleverly chosen $g$, we derive a lower bound on the smallest output entropy that can be achieved by quantizing a $d$-dimensional source with given expected $r$th-power distortion. Specialized to the one-dimensional case, and in the limit of vanishing distortion, this lower bound converges to the output entropy achieved by a uniform quantizer, thereby recovering the result by Gish and Pierce that uniform quantizers are asymptotically optimal as the allowed distortion tends to zero. Our lower bound holds for any $d$-dimensional memoryless source that has a pdf and whose differential entropy and Rényi information dimension are finite. In contrast to Gish and Pierce, we do not require any additional constraints on the continuity or decay of the source pdf.
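As standard background (this is the textbook argument, not the paper's specific derivation), Gibbs's inequality is simply the non-negativity of relative entropy, and the Gish–Pierce benchmark is the classical high-resolution approximation of a uniform quantizer's output entropy; the notation $Q_\Delta$ and $\Delta$ below is introduced here and does not appear in the abstract:

% Standard background sketch; not the paper's derivation.
\begin{align*}
-\mathsf{E}[\log g(X)] - h(X)
  &= -\int f(x)\log g(x)\,dx + \int f(x)\log f(x)\,dx \\
  &= \int f(x)\log\frac{f(x)}{g(x)}\,dx
   = D(f\,\|\,g) \;\ge\; 0,
\end{align*}

with equality if and only if $f=g$ almost everywhere (provided the cross entropy $-\mathsf{E}[\log g(X)]$ is well defined). For a uniform scalar quantizer $Q_\Delta$ with cell width $\Delta$, approximating $f$ as constant on each cell gives the high-resolution approximation

\[
H\bigl(Q_\Delta(X)\bigr) \;\approx\; h(X) - \log\Delta \qquad \text{as } \Delta\to 0,
\]

while the expected distortion behaves like $\mathsf{E}\bigl[|X-Q_\Delta(X)|^{r}\bigr]\approx \Delta^{r}/\bigl(2^{r}(r+1)\bigr)$ (e.g., $\Delta^{2}/12$ for $r=2$). This uniform-quantizer output entropy is the quantity the lower bound converges to in the one-dimensional, vanishing-distortion limit.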

[1] Saburo Tazaki et al., Asymptotic performance of block quantizers with difference distortion measures, 1980, IEEE Trans. Inf. Theory.

[2] Amir Dembo et al., The rate-distortion dimension of sets and measures, 1994, IEEE Trans. Inf. Theory.

[3] Amos Lapidoth et al., Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels, 2003, IEEE Trans. Inf. Theory.

[4] Tamás Linder et al., On the asymptotic tightness of the Shannon lower bound, 1994, IEEE Trans. Inf. Theory.

[5] A. Rényi, On the dimension and entropy of probability distributions, 1959.

[6] Tobias Koch et al., The Shannon Lower Bound Is Asymptotically Tight, 2015, IEEE Trans. Inf. Theory.

[7] C. E. Shannon, A Mathematical Theory of Communication, 1948, Bell Syst. Tech. J.

[8] E. Stein et al., Introduction to Fourier Analysis on Euclidean Spaces, 1971.

[9] Noga Alon et al., A lower bound on the expected length of one-to-one codes, 1994, IEEE Trans. Inf. Theory.

[10] Paul L. Zador et al., Asymptotic quantization error of continuous signals and the quantization dimension, 1982, IEEE Trans. Inf. Theory.

[11] T. Koch et al., A necessary and sufficient condition for the asymptotic tightness of the Shannon lower bound, 2016.

[12] Herbert Gish et al., Asymptotically efficient quantizing, 1968, IEEE Trans. Inf. Theory.

[13] Tamás Linder et al., Asymptotic entropy-constrained performance of tessellating and universal randomized lattice quantization, 1994, IEEE Trans. Inf. Theory.

[14] Thomas M. Cover et al., Elements of Information Theory, 2005.

[15] Victoria Kostina et al., Data compression with low distortion and finite blocklength, 2015, 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton).

[16] N. J. A. Sloane et al., A lower bound on the average error of vector quantizers, 1985, IEEE Trans. Inf. Theory.

[17] T. Koch, The Shannon Lower Bound is Asymptotically Tight for Sources with Finite Rényi Information Dimension, 2015.

[18] Tamás Linder et al., A Lagrangian formulation of Zador's entropy-constrained quantization theorem, 2002, IEEE Trans. Inf. Theory.

[19] L. Goddard, Information Theory, 1962, Nature.

[20] R. Ash et al., Probability and Measure Theory, 1999.

[21] Toby Berger, Rate distortion theory: a mathematical basis for data compression, 1971.

[22] Patrick Billingsley, Probability and Measure, 1986.

[23] Yihong Wu et al., Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression, 2010, IEEE Trans. Inf. Theory.

[24] Tamás Linder et al., On the structure of optimal entropy-constrained scalar quantizers, 2002, IEEE Trans. Inf. Theory.