Fixed rate universal block source coding with a fidelity criterion
[1] William Feller. An Introduction to Probability Theory and Its Applications, 1951.
[2] A. Yaglom. An Introduction to the Theory of Stationary Random Functions, 1963.
[3] V. Rokhlin. Lectures on the Entropy Theory of Measure-Preserving Transformations, 1967.
[4] K. Parthasarathy. Probability Measures on Metric Spaces, 1967.
[5] K. Parthasarathy. Probability Measures in a Metric Space, 1967.
[6] T. Kailath. The Divergence and Bhattacharyya Distance Measures in Signal Selection, 1967.
[7] David J. Sakrison. The Rate Distortion Function for a Class of Sources, 1969, Inf. Control.
[8] Toby Berger. Rate Distortion Theory: A Mathematical Basis for Data Compression, 1971.
[9] Jacob Ziv. Coding of sources with unknown statistics-II: Distortion relative to a fidelity criterion, 1972, IEEE Trans. Inf. Theory.
[10] Robert M. Gray. Correction to 'Information Rates of Stationary-Ergodic Finite-Alphabet Sources', 1973, IEEE Trans. Inf. Theory.
[11] D. Ornstein. An Application of Ergodic Theory to Probability Theory, 1973.
[12] Lee D. Davisson. Universal noiseless coding, 1973, IEEE Trans. Inf. Theory.
[13] Robert M. Gray et al. Source coding theorems without the ergodic assumption, 1974, IEEE Trans. Inf. Theory.
[14] Robert M. Gray et al. The ergodic decomposition of stationary discrete random processes, 1974, IEEE Trans. Inf. Theory.
[15] John C. Kieffer. On the optimum average distortion attainable by fixed-rate coding of a nonergodic source, 1975, IEEE Trans. Inf. Theory.
[16] R. Gray et al. A Generalization of Ornstein's $\bar d$ Distance with Applications to Information Theory, 1975.