The Shannon Lower Bound Is Asymptotically Tight
[1] Saburo Tazaki, et al. Asymptotic performance of block quantizers with difference distortion measures, 1980, IEEE Trans. Inf. Theory.
[2] Helmut Bölcskei, et al. Degrees of Freedom in Vector Interference Channels, 2012, IEEE Transactions on Information Theory.
[3] R. Ash, et al. Probability and Measure Theory, 1999.
[4] R. Durrett. Probability: Theory and Examples, 1993.
[5] Imre Csiszár, et al. Some remarks on the dimension and entropy of random variables, 1964.
[6] R. Gray. Entropy and Information Theory, 1990, Springer New York.
[7] Charles R. Johnson, et al. Matrix Analysis, 1985.
[8] Tobias Koch, et al. A general rate-distortion converse bound for entropy-constrained scalar quantization, 2016 IEEE International Symposium on Information Theory (ISIT), 2016.
[9] Aaron D. Wyner, et al. Coding Theorems for a Discrete Source With a Fidelity Criterion, Institute of Radio Engineers, International Convention Record, vol. 7, 1959; reprinted 1993.
[11] Robert M. Gray, et al. Entropy and Information Theory, 2nd ed., 2014.
[12] Tobias Koch, et al. Rate-Distortion Bounds for High-Resolution Vector Quantization via Gibbs's Inequality, 2015, arXiv.
[13] Tamás Linder, et al. On the asymptotic tightness of the Shannon lower bound, 1994, IEEE Trans. Inf. Theory.
[14] Toby Berger, et al. Rate Distortion Theory: A Mathematical Basis for Data Compression, 1971.
[15] Victoria Kostina, et al. Data compression with low distortion and finite blocklength, 53rd Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2015.
[16] Tamás Linder, et al. A Lagrangian formulation of Zador's entropy-constrained quantization theorem, 2002, IEEE Trans. Inf. Theory.
[17] P. Billingsley, et al. Probability and Measure, 1980.
[18] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[19] Herbert Gish, et al. Asymptotically efficient quantizing, 1968, IEEE Trans. Inf. Theory.
[20] David L. Neuhoff, et al. Quantization, 1998, IEEE Trans. Inf. Theory.
[21] Paul L. Zador, et al. Asymptotic quantization error of continuous signals and the quantization dimension, 1982, IEEE Trans. Inf. Theory.
[22] Charles R. Johnson, et al. Topics in Matrix Analysis, 1991.
[23] Peter M. Schultheiss, et al. Information rates of non-Gaussian processes, 1964, IEEE Trans. Inf. Theory.
[24] Amir Dembo, et al. The rate-distortion dimension of sets and measures, 1994, IEEE Trans. Inf. Theory.
[25] Amos Lapidoth, et al. Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels, 2003, IEEE Trans. Inf. Theory.
[26] Yihong Wu, et al. Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression, 2010, IEEE Transactions on Information Theory.
[28] A. Rényi. On the dimension and entropy of probability distributions, 1959.
[29] Jacob Binia, et al. On the epsilon-entropy and the rate-distortion function of certain non-Gaussian processes, 1974, IEEE Trans. Inf. Theory.
[30] Imre Csiszár, et al. Arbitrarily varying channels with general alphabets and states, 1992, IEEE Trans. Inf. Theory.