Block coding for discrete stationary d̄-continuous noisy channels
R. Gray | D. Ornstein