Variable-Length Compression Allowing Errors
[1] Sergio Verdú, et al. Variable-length compression allowing errors (extended), 2014, arXiv.
[2] Alexandr A. Borovkov, et al. Limit Theorems of Probability Theory, 2011.
[3] Akisato Kimura, et al. Weak variable-length Slepian-Wolf coding with linked encoders for mixed sources, 2004, IEEE Transactions on Information Theory.
[4] Hiroki Koga, et al. Source Coding Using Families of Universal Hash Functions, 2007, IEEE Transactions on Information Theory.
[5] V. V. Petrov. Limit Theorems of Probability Theory: Sequences of Independent Random Variables, 1995.
[6] Te Sun Han, et al. Weak variable-length source coding, 2000, IEEE Transactions on Information Theory.
[7] John C. Kieffer. Strong converses in source coding relative to a fidelity criterion, 1991, IEEE Transactions on Information Theory.
[8] V. Statulevičius, et al. Limit Theorems of Probability Theory, 2000.
[9] A. J. Viterbi, et al. A Lower Bound on the Expected Length of One-to-One Codes, 1994.
[10] H. Vincent Poor, et al. Feedback in the Non-Asymptotic Regime, 2011, IEEE Transactions on Information Theory.
[11] Imre Csiszár, et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.
[12] Thomas M. Cover, et al. Some equivalences between Shannon entropy and Kolmogorov complexity, 1978, IEEE Transactions on Information Theory.
[13] Aaron D. Wyner, et al. Coding Theorems for a Discrete Source With a Fidelity Criterion (Institute of Radio Engineers, International Convention Record, vol. 7, 1959), 1993.
[14] En-Hui Yang, et al. The redundancy of source coding with a fidelity criterion - Part II: Coding at a fixed rate level with unknown statistics, 2001, IEEE Transactions on Information Theory.
[15] Hiroki Koga, et al. Information-Spectrum Methods in Information Theory, 2002.
[16] Hirosuke Yamamoto, et al. Asymptotic properties on codeword lengths of an optimal FV code for general sources, 2005, IEEE Transactions on Information Theory.
[17] Amir Dembo, et al. Critical behavior in lossy source coding, 2000, IEEE Transactions on Information Theory.
[18] Sergio Verdú, et al. Fixed-Length Lossy Compression in the Finite Blocklength Regime, 2011, IEEE Transactions on Information Theory.
[19] Ioannis Kontoyiannis, et al. Pointwise redundancy in lossy data compression and universal lossy data compression, 2000, IEEE Transactions on Information Theory.
[20] Wojciech Szpankowski, et al. A One-to-One Code and Its Anti-Redundancy, 2008, IEEE Transactions on Information Theory.
[21] Sergio Verdú, et al. Optimal Lossless Data Compression: Non-Asymptotics and Asymptotics, 2014, IEEE Transactions on Information Theory.
[22] Aaron D. Wyner, et al. An Upper Bound on the Entropy Series, 1972, Information and Control.
[23] Noga Alon, et al. A lower bound on the expected length of one-to-one codes, 1994, IEEE Transactions on Information Theory.
[24] Frederick Jelinek, et al. Probabilistic Information Theory: Discrete and Memoryless Models, 1968.
[25] Wojciech Szpankowski, et al. Minimum Expected Length of Fixed-to-Variable Lossless Compression Without Prefix Constraints, 2011, IEEE Transactions on Information Theory.
[26] H. Vincent Poor, et al. Channel Coding Rate in the Finite Blocklength Regime, 2010, IEEE Transactions on Information Theory.
[27] Oliver Kosut, et al. Asymptotics and Non-Asymptotics for Universal Fixed-to-Variable Source Coding, 2014, IEEE Transactions on Information Theory.
[28] E. Posner, et al. Epsilon entropy of stochastic processes, 1967.
[29] V. Erokhin. ε-Entropy of a Discrete Random Variable, 1958.
[30] Zhen Zhang, et al. On the Redundancy of Lossy Source Coding with Abstract Alphabets, 1999, IEEE Transactions on Information Theory.
[31] Gou Hosoya, et al. Conference participation report: 2014 IEEE International Symposium on Information Theory, 2014.
[32] Yuval Kochman, et al. The Dispersion of Lossy Source Coding, 2011 Data Compression Conference.
[33] E. Posner, et al. Epsilon Entropy and Data Compression, 1971.