Entropy and Information Theory
[1] G. Birkhoff. Proof of the Ergodic Theorem, 1931, Proceedings of the National Academy of Sciences.
[2] J. von Neumann. Zur Operatorenmethode in der klassischen Mechanik, 1932.
[3] Shizuo Kakutani, et al. 131. Induced Measure Preserving Transformations, 1943.
[4] B. McMillan. The Basic Theorems of Information Theory, 1953.
[5] Amiel Feinstein, et al. A new basic theorem of information theory, 1954, Trans. IRE Prof. Group Inf. Theory.
[6] P. Halmos. Lectures on ergodic theory, 1956.
[7] D. Slepian. A class of binary signaling alphabets, 1956.
[8] L. Breiman. The Individual Ergodic Theorem of Information Theory, 1957.
[9] P. Elias, et al. Two famous papers (Editorial), 1958.
[10] S. Kullback, et al. Information Theory and Statistics, 1959.
[11] Amiel Feinstein. On the coding theorem and its converse for finite-memory channels, 1959.
[12] K. Jacobs. Die Übertragung diskreter Informationen durch periodische und fastperiodische Kanäle, 1959.
[14] Jacob Wolfowitz, et al. A Note on the Strong Converse of the Coding Theorem for the General Discrete Finite-Memory Channel, 1960, Inf. Control.
[15] Johannes G. A. van Mill, et al. Transmission of Information, 1961.
[16] R. Adler. Ergodic and mixing properties of infinite memory channels, 1961.
[17] Shu-Teh Chen Moy, et al. Generalizations of Shannon-McMillan theorem, 1961.
[18] Jacob Wolfowitz. Coding Theorems of Information Theory, 1962.
[19] K. Jacobs. Über die Struktur der mittleren Entropie, 1962.
[20] C. Caramanis. What is ergodic theory, 1963.
[21] Vladimir I. Levenshtein, et al. Binary codes capable of correcting deletions, insertions, and reversals, 1965.
[22] I. Good, et al. Ergodic theory and information, 1966.
[23] Toby Berger. Rate Distortion Theory for Sources with Abstract Alphabets and Memory, 1968, Inf. Control.
[24] Elwyn R. Berlekamp, et al. Algebraic coding theory, 1984, McGraw-Hill series in systems science.
[25] R. Gallager. Information Theory and Reliable Communication, 1968.
[26] T. T. Kadota. Generalization of Feinstein's fundamental lemma (Corresp.), 1970, IEEE Trans. Inf. Theory.
[27] D. Ornstein. Bernoulli shifts with the same entropy are isomorphic, 1970.
[28] N. Friedman, et al. Introduction to Ergodic Theory, 1971.
[29] Rudolf Ahlswede, et al. Channels without synchronization, 1971.
[30] Richard E. Blahut, et al. Computation of channel capacity and rate-distortion functions, 1972, IEEE Trans. Inf. Theory.
[31] J. Kieffer. A Counterexample to Perez's Generalization of the Shannon-McMillan Theorem, 1973.
[32] D. Ornstein. An Application of Ergodic Theory to Probability Theory, 1973.
[33] P. Shields. The theory of Bernoulli shifts, 1973.
[34] Barry M. Leiner, et al. Bounds on rate-distortion functions for stationary sources and context-dependent fidelity criteria (Corresp.), 1973, IEEE Trans. Inf. Theory.
[35] G. Forney, Jr., et al. The Viterbi algorithm, 1973.
[36] Robert M. Gray, et al. Source coding theorems without the ergodic assumption, 1974, IEEE Trans. Inf. Theory.
[37] John C. Kieffer, et al. A General Formula for the Capacity of Stationary Nonanticipatory Channels, 1974, Inf. Control.
[38] D. Ornstein. Ergodic theory, randomness, and dynamical systems, 1974.
[39] S. Varadhan, et al. Asymptotic evaluation of certain Markov process expectations for large time, 1975.
[40] David L. Neuhoff, et al. Fixed rate universal block source coding with a fidelity criterion, 1975, IEEE Trans. Inf. Theory.
[41] R. Gray, et al. Nonblock Source Coding with a Fidelity Criterion, 1975.
[42] I. Csiszár. $I$-Divergence Geometry of Probability Distributions and Minimization Problems, 1975.
[43] John C. Kieffer. On the optimum average distortion attainable by fixed-rate coding of a nonergodic source, 1975, IEEE Trans. Inf. Theory.
[44] David L. Neuhoff, et al. Process definitions of distortion-rate functions and source coding theorems, 1975, IEEE Trans. Inf. Theory.
[45] P. Walters. Ergodic theory: Introductory lectures, 1975.
[46] R. Gray, et al. A Generalization of Ornstein's $\bar d$ Distance with Applications to Information Theory, 1975.
[47] Rudolf Ahlswede, et al. Two contributions to information theory, 1975.
[48] Robert M. Gray, et al. Sliding-block joint source/noisy-channel coding theorems, 1976, IEEE Trans. Inf. Theory.
[49] James R. Brown, et al. Ergodic theory and topological dynamics, 1976.
[50] K. Sigmund, et al. Ergodic Theory on Compact Spaces, 1976.
[51] A. Maitra, et al. Integral representations of invariant measures, 1977.
[52] David L. Neuhoff, et al. Block and sliding-block source coding, 1977, IEEE Trans. Inf. Theory.
[53] F. MacWilliams, et al. The Theory of Error-Correcting Codes, 1977.
[54] John C. Kieffer. A generalization of the Pursley-Davisson-Mackenthun universal variable-rate coding theorem, 1977, IEEE Trans. Inf. Theory.
[55] Edward C. van der Meulen, et al. A survey of multi-way channels in information theory: 1961-1976, 1977, IEEE Trans. Inf. Theory.
[56] Robert J. McEliece, et al. The Theory of Information and Coding, 1979.
[57] Sui Tung, et al. Multiterminal source coding (Ph.D. thesis abstr.), 1978, IEEE Trans. Inf. Theory.
[58] James George Dunham. A note on the abstract alphabet block source coding with a fidelity criterion theorem (Corresp.), 1978, IEEE Trans. Inf. Theory.
[59] John C. Kieffer, et al. A unified approach to weak universal source coding, 1978, IEEE Trans. Inf. Theory.
[60] Aaron D. Wyner, et al. A Definition of Conditional Mutual Information for Arbitrary Ensembles, 1978, Inf. Control.
[61] David L. Neuhoff, et al. Channels with almost finite memory, 1979, IEEE Trans. Inf. Theory.
[62] R. Gray, et al. Robustness of Estimators on Stationary Observations, 1979.
[63] Andrew J. Viterbi, et al. Principles of Digital Communication and Coding, 1979.
[64] D. S. Jones, et al. Elementary information theory, 1979.
[65] John C. Kieffer, et al. Extension of source coding theorems for block codes to sliding-block codes, 1980, IEEE Trans. Inf. Theory.
[66] A. El Gamal, et al. Multiple user information theory, 1980, Proceedings of the IEEE.
[67] R. Gray, et al. Asymptotically Mean Stationary Measures, 1980.
[68] John C. Kieffer, et al. Block coding for weakly continuous channels, 1981, IEEE Trans. Inf. Theory.
[69] Alan D. Sokal, et al. Existence of compatible families of proper regular conditional probabilities, 1981.
[70] J. Kieffer, et al. Markov Channels are Asymptotically Mean Stationary, 1981.
[71] G. Koumoullis. On perfect measures, 1981.
[72] Michael B. Pursley, et al. Efficient universal noiseless source codes, 1981, IEEE Trans. Inf. Theory.
[73] David L. Neuhoff, et al. Channel Entropy and Primitive Approximation, 1982.
[74] David L. Neuhoff, et al. Indecomposable finite state channels and primitive approximation, 1982, IEEE Trans. Inf. Theory.
[75] David L. Neuhoff, et al. Channel Distances and Representation, 1982, Inf. Control.
[76] John C. Kieffer, et al. Sliding-block coding for weakly continuous channels, 1982, IEEE Trans. Inf. Theory.
[77] David L. Neuhoff, et al. Causal source codes, 1982, IEEE Trans. Inf. Theory.
[78] V. Cuperman, et al. Vector quantization: A pattern-matching technique for speech coding, 1983, IEEE Communications Magazine.
[79] R. Blahut. Theory and practice of error control codes, 1983.
[80] D. Ornstein, et al. The Shannon-McMillan-Breiman theorem for a class of amenable groups, 1983.
[81] M. Hassner, et al. Algorithms for sliding block codes - An application of symbolic dynamics to information theory, 1983, IEEE Trans. Inf. Theory.
[82] Robert M. Gray, et al. Block source coding theory for asymptotically mean stationary sources, 1984, IEEE Trans. Inf. Theory.
[83] R. Gray, et al. Vector quantization, 1984, IEEE ASSP Magazine.
[84] S. Varadhan. Large Deviations and Applications, 1984.
[85] A. Barron. The strong ergodic theorem for densities: Generalized Shannon-McMillan-Breiman theorem, 1985.
[86] K. H. Barratt. Digital Coding of Waveforms, 1985.
[87] Brian H. Marcus, et al. Sofic systems and encoding data, 1985, IEEE Trans. Inf. Theory.
[88] J. Makhoul, et al. Vector quantization in speech coding, 1985, Proceedings of the IEEE.
[89] Robert M. Gray, et al. The design of joint source and channel trellis waveform coders, 1987, IEEE Trans. Inf. Theory.
[90] Robert M. Gray, et al. Ergodicity of Markov channels, 1987, IEEE Trans. Inf. Theory.
[91] Paul C. Shields, et al. The ergodic and entropy theorems revisited, 1987, IEEE Trans. Inf. Theory.
[92] G. G. Langdon, et al. Data compression, 1988, IEEE Potentials.
[93] James A. Bucklew, et al. A large deviation theory proof of the abstract alphabet source coding theorem, 1988, IEEE Trans. Inf. Theory.
[94] T. Cover, et al. A sandwich proof of the Shannon-McMillan-Breiman theorem, 1988.
[95] R. Gray. Source Coding Theory, 1989.
[96] J. Kieffer. An ergodic theorem for constrained sequences of functions, 1989.
[97] Robert M. Gray, et al. Spectral analysis of quantization noise in a single-loop sigma-delta modulator with DC input, 1989, IEEE Trans. Commun.
[98] P. Gács, et al. Kolmogorov's contributions to information theory and algorithmic complexity, 1989.
[99] John C. Kieffer, et al. Sample converses in source coding theory, 1991, IEEE Trans. Inf. Theory.
[101] Allen Gersho, et al. Vector quantization and signal compression, 1991, The Kluwer international series in engineering and computer science.