An on-line universal lossy data compression algorithm via continuous codebook refinement - Part II: Optimality for φ-mixing source models
[1] Robert M. Gray,et al. Ergodicity of Markov channels , 1987, IEEE Trans. Inf. Theory.
[2] P. Gács,et al. Bounds on conditional probabilities with applications in multi-user communication , 1976 .
[3] P. Billingsley,et al. Convergence of Probability Measures , 1969 .
[4] D. Halverson,et al. Discrete-time detection in ε-mixing noise , 1980, IEEE Trans. Inf. Theory.
[5] R. Gray,et al. Asymptotically Mean Stationary Measures , 1980 .
[6] J. Kieffer,et al. Markov Channels are Asymptotically Mean Stationary , 1981 .
[7] Marius Iosifescu,et al. Finite Markov Processes and Their Applications , 1981 .
[8] Neri Merhav,et al. Estimating the number of states of a finite-state source , 1992, IEEE Trans. Inf. Theory.
[9] R. Gray. Entropy and Information Theory , 1990, Springer New York.
[10] John C. Kieffer,et al. Sample converses in source coding theory , 1991, IEEE Trans. Inf. Theory.
[11] Hossam M. H. Shalaby,et al. Error exponents for distributed detection of Markov sources , 1994, IEEE Trans. Inf. Theory.
[12] En-Hui Yang,et al. Simple universal lossy data compression schemes derived from the Lempel-Ziv algorithm , 1996, IEEE Trans. Inf. Theory.
[13] P. C. Shields,et al. Ergodic Processes And Zero Divergence , 1991, Proceedings. 1991 IEEE International Symposium on Information Theory.
[14] Tamás Linder,et al. Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding , 1994, IEEE Trans. Inf. Theory.
[15] Katalin Marton,et al. A simple proof of the blowing-up lemma , 1986, IEEE Trans. Inf. Theory.
[16] Zhen Zhang,et al. An on-line universal lossy data compression algorithm via continuous codebook refinement - Part I: Basic results , 1996, IEEE Trans. Inf. Theory.