A First Course in Information Theory
[1] James Richard Roche. Distributed information storage, 1992.
[2] William Feller, et al. An Introduction to Probability Theory and Its Applications, 1967.
[3] Zhen Zhang, et al. A class of non-Shannon-type information inequalities and their applications, 2001, Commun. Inf. Syst.
[4] Richard E. Blahut, et al. Computation of channel capacity and rate-distortion functions, 1972, IEEE Trans. Inf. Theory.
[5] Adi Shamir, et al. How to share a secret, 1979, CACM.
[6] Robert G. Gallager, et al. A simple derivation of the coding theorem and some applications, 1965, IEEE Trans. Inf. Theory.
[7] Bruno O. Shubert, et al. Random variables and stochastic processes, 1979.
[8] Amiel Feinstein, et al. Information and information stability of random variables and processes, 1964.
[9] Abraham Lempel, et al. Compression of individual sequences via variable-rate coding, 1978, IEEE Trans. Inf. Theory.
[10] Raymond W. Yeung, et al. A framework for linear information inequalities, 1997, IEEE Trans. Inf. Theory.
[11] Ming Li, et al. An Introduction to Kolmogorov Complexity and Its Applications, 2019, Texts in Computer Science.
[12] John B. Anderson, et al. Source and Channel Coding: An Algorithmic Approach, 1991.
[13] Te Sun Han. Nonnegative Entropy Measures of Multivariate Symmetric Correlations, 1978, Inf. Control.
[14] A. Kolmogorov. Three approaches to the quantitative definition of information, 1968.
[15] James G. Oxley, et al. Matroid theory, 1992.
[16] Brendan J. Frey, et al. Factor graphs and the sum-product algorithm, 2001, IEEE Trans. Inf. Theory.
[17] Masud Mansuripur, et al. Introduction to information theory, 1986.
[18] Zhen Zhang, et al. Distributed Source Coding for Satellite Communications, 1999, IEEE Trans. Inf. Theory.
[20] R.W. Yeung, et al. On factorization of positive functions, 2001, Proc. 2001 IEEE International Symposium on Information Theory.
[21] Yasuichi Horibe. An Improved Bound for Weight-Balanced Tree, 1977, Inf. Control.
[22] En-Hui Yang, et al. Efficient universal lossless data compression algorithms based on a greedy sequential grammar transform - Part one: Without context models, 2000, IEEE Trans. Inf. Theory.
[23] Richard D. Gitlin, et al. Diversity coding for transparent self-healing and fault-tolerant communication networks, 1993, IEEE Trans. Commun.
[24] Frans M. J. Willems, et al. The context-tree weighting method: basic properties, 1995, IEEE Trans. Inf. Theory.
[25] L. P. Hyvarinen, et al. Information theory for systems engineers, 1968.
[26] A. Glavieux, et al. Near Shannon limit error-correcting coding and decoding: Turbo-codes (1), 1993, Proceedings of ICC '93 - IEEE International Conference on Communications.
[27] Thomas M. Cover, et al. An algorithm for maximizing expected log investment return, 1984, IEEE Trans. Inf. Theory.
[28] László Lovász, et al. On the Shannon capacity of a graph, 1979, IEEE Trans. Inf. Theory.
[29] J. A. Bondy, et al. Graph Theory with Applications, 1978.
[30] Silviu Guiaşu, et al. Information theory with applications, 1977.
[31] Allen Gersho, et al. Vector quantization and signal compression, 1991, The Kluwer international series in engineering and computer science.
[32] Raymond W. Yeung, et al. Some basic properties of fix-free codes, 2001, IEEE Trans. Inf. Theory.
[33] Aaron D. Wyner, et al. The rate-distortion function for source coding with side information at the decoder, 1976, IEEE Trans. Inf. Theory.
[34] Suguru Arimoto, et al. An algorithm for computing the capacity of arbitrary discrete memoryless channels, 1972, IEEE Trans. Inf. Theory.
[35] Michel Mouchart, et al. Discussion on "Conditional independence in statistical theory" by A.P. Dawid, 1979.
[36] A. Dawid. Conditional Independence in Statistical Theory, 1979.
[37] Brockway McMillan, et al. Two inequalities implied by unique decipherability, 1956, IRE Trans. Inf. Theory.
[38] Edward C. van der Meulen, et al. Some Reflections On The Interference Channel, 1994.
[39] Terence Chan. A combinatorial approach to information inequalities, 2001, Commun. Inf. Syst.
[40] Suguru Arimoto, et al. On the converse to the coding theorem for discrete memoryless channels (Corresp.), 1973, IEEE Trans. Inf. Theory.
[41] Shunsuke Ihara, et al. Information theory for continuous systems, 1993.
[42] Kenneth Rose, et al. A mapping approach to rate-distortion computation and analysis, 1994, IEEE Trans. Inf. Theory.
[43] Jan C. van der Lubbe, et al. Information theory, 1997.
[44] Claude E. Shannon, et al. The Mathematical Theory of Communication, 1950.
[45] Zhen Zhang, et al. On Characterization of Entropy Function via Information Inequalities, 1998, IEEE Trans. Inf. Theory.
[46] Shun-ichi Amari, et al. Differential-geometrical methods in statistics, 1985.
[47] Solomon W. Golomb, et al. Basic Concepts in Information Theory and Coding: The Adventures of Secret Agent 00111, 1994.
[48] Philip M. Woodward, et al. Probability and Information Theory with Applications to Radar, 1954.
[49] Amiel Feinstein, et al. A new basic theorem of information theory, 1954, Trans. IRE Prof. Group Inf. Theory.
[50] Raymond W. Yeung, et al. On a relation between information inequalities and group theory, 2002, IEEE Trans. Inf. Theory.
[51] D. Huffman. A Method for the Construction of Minimum-Redundancy Codes, 1952.
[52] Shlomo Shamai, et al. The empirical distribution of good codes, 1997, IEEE Trans. Inf. Theory.
[53] Vahid Tarokh, et al. Existence of optimal prefix codes for infinite source alphabets, 1997, IEEE Trans. Inf. Theory.
[54] Robert B. Ash, et al. Information Theory, 2020, The SAGE International Encyclopedia of Mass Media and Society.
[55] Michael O. Rabin, et al. Efficient dispersal of information for security, load balancing, and fault tolerance, 1989, JACM.
[56] Nikolai K. Vereshchagin, et al. Combinatorial interpretation of Kolmogorov complexity, 2000, Proceedings 15th Annual IEEE Conference on Computational Complexity.
[57] En-Hui Yang, et al. Grammar-based codes: A new class of universal lossless source codes, 2000, IEEE Trans. Inf. Theory.
[58] Norman Abramson, et al. Information theory and coding, 1963.
[59] Shu Lin, et al. Error control coding: fundamentals and applications, 1983.
[60] George B. Dantzig, et al. Linear programming and extensions, 1965.
[61] Robert J. McEliece, et al. The Theory of Information and Coding, 1979.
[62] Stephen B. Wicker, et al. Turbo Coding, 1998.
[63] D. J. Wheeler, et al. A Block-sorting Lossless Data Compression Algorithm, 1994.
[64] Raymond W. Yeung, et al. A simple upper bound on the redundancy of Huffman codes, 2002, IEEE Trans. Inf. Theory.
[65] Imre Csiszár, et al. The capacity of the arbitrarily varying channel revisited: Positivity, constraints, 1988, IEEE Trans. Inf. Theory.
[66] M. Lunelli, et al. Representation of matroids, 2002, math/0202294.
[67] Nikolai K. Vereshchagin, et al. Inequalities for Shannon Entropy and Kolmogorov Complexity, 1997, J. Comput. Syst. Sci.
[68] Marten van Dijk. On the information rate of perfect secret sharing schemes, 1995, Des. Codes Cryptogr.
[69] Toby Berger, et al. Information measures for discrete random fields, 1998.
[70] Sergio Verdú, et al. The source-channel separation theorem revisited, 1995, IEEE Trans. Inf. Theory.
[71] Satoru Fujishige, et al. Polymatroidal Dependence Structure of a Set of Random Variables, 1978, Inf. Control.
[73] A. Barron. The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem, 1985.
[74] P. Shields. The Ergodic Theory of Discrete Sample Paths, 1996.
[75] Toby Berger, et al. Multiterminal source encoding with encoder breakdown, 1989, IEEE Trans. Inf. Theory.
[76] I. Reed, et al. Polynomial Codes Over Certain Finite Fields, 1960.
[77] Thomas M. Cover, et al. A Proof of the Data Compression Theorem of Slepian and Wolf for Ergodic Sources, 1971.
[78] Alon Orlitsky, et al. Worst-case interactive communication I: Two messages are almost optimal, 1990, IEEE Trans. Inf. Theory.
[79] Andrew J. Viterbi, et al. Error bounds for convolutional codes and an asymptotically optimum decoding algorithm, 1967, IEEE Trans. Inf. Theory.
[80] Francesco M. Malvestuto, et al. Comment on "A unique formal system for binary decompositions of database relations, probability distributions, and graphs", 1992, Inf. Sci.
[81] Frantisek Matús, et al. Conditional Independences among Four Random Variables II, 1995, Combinatorics, Probability and Computing.
[82] Claude E. Shannon, et al. Communication theory of secrecy systems, 1949, Bell Syst. Tech. J.
[83] David J. C. MacKay, et al. Good Error-Correcting Codes Based on Very Sparse Matrices, 1997, IEEE Trans. Inf. Theory.
[84] Yasutada Oohama. Gaussian multiterminal source coding, 1997, IEEE Trans. Inf. Theory.
[85] Gustavus J. Simmons, et al. Contemporary Cryptology: The Science of Information Integrity, 1994.
[86] Seth Zimmerman. An Optimal Search Procedure, 1959.
[87] Shu-Teh Chen Moy, et al. Generalizations of Shannon-McMillan theorem, 1961.
[88] František Matúš, et al. Conditional Independences among Four Random Variables III: Final Conclusion, 1999, Combinatorics, Probability and Computing.
[89] R. Koetter, et al. An algebraic approach to network coding, 2001, Proc. 2001 IEEE International Symposium on Information Theory.
[90] Douglas R. Stinson, et al. An explication of secret sharing schemes, 1992, Des. Codes Cryptogr.
[91] Arak M. Mathai, et al. Basic Concepts in Information Theory and Statistics: Axiomatic Foundations and Applications, 1975.
[92] Jack K. Wolf, et al. Noiseless coding of correlated information sources, 1973, IEEE Trans. Inf. Theory.
[93] John C. Kieffer, et al. A survey of the theory of source coding with a fidelity criterion, 1993, IEEE Trans. Inf. Theory.
[94] Aaron D. Wyner, et al. On source coding with side information at the decoder, 1975, IEEE Trans. Inf. Theory.
[95] Raymond W. Yeung. Local redundancy and progressive bounds on the redundancy of a Huffman code, 1991, IEEE Trans. Inf. Theory.
[97] T. Cover, et al. A sandwich proof of the Shannon-McMillan-Breiman theorem, 1988.
[98] Fang-Wei Fu, et al. On the rate-distortion region for multiple descriptions, 2002, IEEE Trans. Inf. Theory.
[99] Zhen Zhang, et al. A non-Shannon-type conditional inequality of information quantities, 1997, IEEE Trans. Inf. Theory.
[100] Stanford Goldman, et al. Information theory, 1953.
[101] Wojciech Szpankowski, et al. Asymptotic average redundancy of Huffman (and other) block codes, 2000, IEEE Trans. Inf. Theory.
[102] Robert G. Gallager, et al. Variations on a theme by Huffman, 1978, IEEE Trans. Inf. Theory.
[103] Sergio Verdú, et al. A general formula for channel capacity, 1994, IEEE Trans. Inf. Theory.
[104] F. Matús. Probabilistic conditional independence structures and matroid theory: background, 1993.
[105] Raymond W. Yeung, et al. Information-theoretic characterizations of conditional mutual independence and Markov random fields, 2002, IEEE Trans. Inf. Theory.
[106] Jorma Rissanen, et al. Universal coding, information, prediction, and estimation, 1984, IEEE Trans. Inf. Theory.
[107] Gregory J. Chaitin. Algorithmic information theory, 1987, Cambridge tracts in theoretical computer science.
[108] John T. Pinkston. An application of rate-distortion theory to a converse to the coding theorem, 1969, IEEE Trans. Inf. Theory.
[109] Te Sun Han. An information-spectrum approach to source coding theorems with a fidelity criterion, 1997, IEEE Trans. Inf. Theory.
[110] K. Chung. A Note on the Ergodic Theorem of Information Theory, 1961.
[111] Shlomo Shamai, et al. Systematic Lossy Source/Channel Coding, 1998, IEEE Trans. Inf. Theory.
[112] J. Massey. Causality, feedback and directed information, 1990.
[113] Stephen B. Wicker, et al. Reed-Solomon Codes and Their Applications, 1999.
[114] Katalin Marton, et al. Error exponent for source coding with a fidelity criterion, 1974, IEEE Trans. Inf. Theory.
[115] Imre Csiszár, et al. Arbitrarily varying channels with constrained inputs and states, 1988, IEEE Trans. Inf. Theory.
[116] Robert G. Gallager, et al. Low-density parity-check codes, 1962, IRE Trans. Inf. Theory.
[117] Lihua Song, et al. Zero-error network coding for acyclic network, 2003, IEEE Trans. Inf. Theory.
[118] Alfredo De Santis, et al. Tight Bounds on the Information Rate of Secret Sharing Schemes, 1997, Des. Codes Cryptogr.
[119] G. D. Forney, Jr. The Viterbi algorithm, 1973.
[120] D. R. Fulkerson, et al. Flows in Networks, 1964.
[121] Rudolf Ahlswede, et al. Network information flow, 2000, IEEE Trans. Inf. Theory.
[122] Steven Roman, et al. Coding and information theory, 1992.
[123] Gunter Dueck, et al. Reliability function of a discrete memoryless channel at rates above capacity (Corresp.), 1979, IEEE Trans. Inf. Theory.
[124] Dake He, et al. Efficient universal lossless data compression algorithms based on a greedy sequential grammar transform - Part two: With context models, 2000, IEEE Trans. Inf. Theory.
[125] Shuo-Yen Robert Li, et al. Linear network coding, 2003, IEEE Trans. Inf. Theory.
[126] Gregory J. Chaitin, et al. Algorithmic Information Theory, 1987, IBM J. Res. Dev.
[127] Peter Elias, et al. Universal codeword sets and representations of the integers, 1975, IEEE Trans. Inf. Theory.
[128] Jacob Wolfowitz. Coding Theorems of Information Theory, 1962.
[129] Raymond W. Yeung, et al. A new outlook of Shannon's information measures, 1991, IEEE Trans. Inf. Theory.
[130] S. Kullback, et al. Topics in statistical information theory, 1987.
[131] Claude E. Shannon, et al. The zero error capacity of a noisy channel, 1956, IRE Trans. Inf. Theory.
[132] James L. Massey, et al. Shift-register synthesis and BCH decoding, 1969, IEEE Trans. Inf. Theory.
[133] Glen G. Langdon, et al. An Introduction to Arithmetic Coding, 1984, IBM J. Res. Dev.
[134] Amiel Feinstein, et al. Foundations of Information Theory, 1959.
[135] G. David Forney, et al. Convolutional codes I: Algebraic structure, 1970, IEEE Trans. Inf. Theory.
[136] Claude E. Shannon, et al. Prediction and Entropy of Printed English, 1951.
[137] C. E. Shannon. A mathematical theory of communication, 1948, Bell Syst. Tech. J.
[138] D. Rubin, et al. Maximum likelihood from incomplete data via the EM algorithm (with discussion), 1977.
[139] D. Blackwell, et al. The Capacities of Certain Channel Classes Under Random Coding, 1960.
[140] Hu Kuo Ting. On the Amount of Information, 1962.
[141] Elwyn R. Berlekamp, et al. Key Papers in the Development of Coding Theory, 1974.
[142] Douglas R. Stinson, et al. New General Lower Bounds on the Information Rate of Secret Sharing Schemes, 1992, CRYPTO.
[143] Jim K. Omura, et al. A coding theorem for discrete-time sources, 1973, IEEE Trans. Inf. Theory.
[144] Lee D. Davisson, et al. Universal noiseless coding, 1973, IEEE Trans. Inf. Theory.
[145] Alon Orlitsky, et al. Worst-case interactive communication II: Two messages are not optimal, 1991, IEEE Trans. Inf. Theory.
[146] Aleksandr Yakovlevich Khinchin, et al. Mathematical foundations of information theory, 1959.
[147] D. Ornstein. Bernoulli shifts with the same entropy are isomorphic, 1970.
[148] Toby Berger, et al. Rate distortion theory: a mathematical basis for data compression, 1971.
[149] Rudolf Ahlswede, et al. Source coding with side information and a converse for degraded broadcast channels, 1975, IEEE Trans. Inf. Theory.
[150] Solomon Kullback, et al. Information Theory and Statistics, 1960.
[151] Andrei N. Kolmogorov, et al. On the Shannon theory of information transmission in the case of continuous signals, 1956, IRE Trans. Inf. Theory.
[152] Raymond W. Yeung, et al. Multi-way alternating minimization, 1995, Proc. 1995 IEEE International Symposium on Information Theory.
[153] J. Wolfowitz. The coding of messages subject to chance errors, 1957.
[154] Raymond W. Yeung, et al. Multilevel diversity coding with distortion, 1995, IEEE Trans. Inf. Theory.
[155] Leon Gordon Kraft, et al. A device for quantizing, grouping, and coding amplitude-modulated pulses, 1949.
[156] Elwyn R. Berlekamp, et al. Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels II, 1967, Inf. Control.
[157] B. McMillan. The Basic Theorems of Information Theory, 1953.
[158] James M. Ooi. Coding for Channels with Feedback, 1998.
[159] E. Jaynes. On the rationale of maximum-entropy methods, 1982, Proceedings of the IEEE.
[160] R. Gallager. Information Theory and Reliable Communication, 1968.
[161] Rodney W. Johnson, et al. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy, 1980, IEEE Trans. Inf. Theory.
[162] Jorma Rissanen, et al. Generalized Kraft Inequality and Arithmetic Coding, 1976, IBM J. Res. Dev.
[163] R. Blahut. Theory and practice of error control codes, 1983.
[164] Y. Kakihara. Abstract Methods in Information Theory, 1999.
[165] Richard E. Blahut. Information bounds of the Fano-Kullback type, 1976, IEEE Trans. Inf. Theory.
[166] Alfredo De Santis, et al. On the size of shares for secret sharing schemes, 1991, Journal of Cryptology.
[167] Te Sun Han, et al. A unified achievable rate region for a general class of multiterminal source coding systems, 1980, IEEE Trans. Inf. Theory.
[168] Jovan Dj. Golic. Noiseless Coding for Multiple Channels, 1994.
[169] H. Chernoff. A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations, 1952.
[170] Te Sun Han, et al. Linear Dependence Structure of the Entropy Space, 1975, Inf. Control.
[171] Edward C. van der Meulen, et al. A survey of multi-way channels in information theory: 1961-1976, 1977, IEEE Trans. Inf. Theory.
[172] L. Breiman. The Individual Ergodic Theorem of Information Theory, 1957.
[173] Philippe Jacquet, et al. Entropy Computations via Analytic Depoissonization, 1999, IEEE Trans. Inf. Theory.
[174] Raymond W. Yeung, et al. Symmetrical multilevel diversity coding, 1997, IEEE Trans. Inf. Theory.
[175] David Slepian, et al. Key papers in the development of information theory, 1974.
[176] Andrei N. Kolmogorov, et al. Logical basis for information theory and probability theory, 1968, IEEE Trans. Inf. Theory.
[177] Bruce Hajek, et al. A Decomposition Theorem for Binary Markov Random Fields, 1987.
[178] Aaron D. Wyner, et al. Claude Elwood Shannon: Collected Papers, 1993.
[179] J. Laurie Snell, et al. Markov Random Fields and Their Applications, 1980.
[180] J. Berstel, et al. Theory of codes, 1985.
[181] Rudolf Ahlswede, et al. Some properties of fix-free codes, 1996.
[182] Sergio Verdú, et al. The role of the asymptotic equipartition property in noiseless source coding, 1997, IEEE Trans. Inf. Theory.
[183] William J. McGill. Multivariate information transmission, 1954, Trans. IRE Prof. Group Inf. Theory.
[184] Abraham Lempel, et al. A universal algorithm for sequential data compression, 1977, IEEE Trans. Inf. Theory.
[185] N. Sloane, et al. Obituary: Claude Shannon (1916–2001), 2001, Nature.
[186] R. Gray. Entropy and Information Theory, 1990, Springer New York.
[187] Marten van Dijk. Secret key sharing and secret key generation, 1997.
[188] Richard W. Hamming, et al. Error detecting and error correcting codes, 1950.
[189] G. Jones, et al. Information and Coding Theory, 2000.
[190] Terry A. Welch, et al. A Technique for High-Performance Data Compression, 1984, Computer.
[191] D. ter Haar. J. A. Wheeler, 1973, Nature.
[192] Fazlollah M. Reza, et al. Introduction to Information Theory, 2004, Lecture Notes in Electrical Engineering.