Fifty Years of Shannon Theory
[1] A. J. Jerri. Correction to "The Shannon sampling theorem—Its various extensions and applications: A tutorial review" , 1979 .
[2] R. Redheffer,et al. Mathematics of Physics and Modern Engineering , 1960 .
[3] Imre Csiszár,et al. Capacity of the Gaussian arbitrarily varying channel , 1991, IEEE Trans. Inf. Theory.
[4] Toby Berger,et al. Rate distortion when side information may be absent , 1985, IEEE Trans. Inf. Theory.
[5] Jacob Ziv,et al. Some lower bounds on signal parameter estimation , 1969, IEEE Trans. Inf. Theory.
[6] P. Varaiya,et al. Capacity of Classes of Gaussian Channels , 1968 .
[7] László Lovász,et al. On the Shannon capacity of a graph , 1979, IEEE Trans. Inf. Theory.
[8] Rüdiger L. Urbanke,et al. A rate-splitting approach to the Gaussian multiple-access channel , 1996, IEEE Trans. Inf. Theory.
[9] B.M. Oliver,et al. The Philosophy of PCM , 1948, Proceedings of the IRE.
[10] William Equitz,et al. Successive refinement of information , 1991, IEEE Trans. Inf. Theory.
[11] Aaron D. Wyner,et al. The rate-distortion function for source coding with side information at the decoder , 1976, IEEE Trans. Inf. Theory.
[12] D. Huffman. A Method for the Construction of Minimum-Redundancy Codes , 1952 .
[13] A. D. Wyner,et al. The sliding-window Lempel-Ziv algorithm is asymptotically optimal , 1994, Proc. IEEE.
[14] Jack K. Wolf,et al. The capacity region of a multiple-access discrete memoryless channel can increase with feedback (Corresp.) , 1975, IEEE Trans. Inf. Theory.
[15] Suguru Arimoto,et al. An algorithm for computing the capacity of arbitrary discrete memoryless channels , 1972, IEEE Trans. Inf. Theory.
[16] Shlomo Shamai,et al. The empirical distribution of good codes , 1997, IEEE Trans. Inf. Theory.
[17] Jeffrey Scott Vitter,et al. Algorithm 673: Dynamic Huffman coding , 1989, TOMS.
[18] S. Muroga. On the Capacity of a Discrete Channel, I: Mathematical expression of capacity of a channel which is disturbed by noise in its every one symbol and expressible in one state diagram , 1953 .
[19] Amir Dembo,et al. Large Deviations Techniques and Applications , 1998 .
[20] Claude E. Shannon,et al. Two-way Communication Channels , 1961 .
[21] Glen G. Langdon,et al. Arithmetic Coding , 1979 .
[22] P. M. Ebert,et al. The capacity of the Gaussian channel with feedback , 1970, Bell Syst. Tech. J..
[23] H. Nyquist,et al. Certain factors affecting telegraph speed , 1924, Journal of the A.I.E.E..
[24] Imre Csiszár. Generalized cutoff rates and Renyi's information measures , 1995, IEEE Trans. Inf. Theory.
[25] Rolf Landauer,et al. Information is Physical , 1991, Workshop on Physics and Computation.
[26] Sergio Verdú,et al. Gaussian multiaccess channels with ISI: Capacity region and multiuser water-filling , 1993, IEEE Trans. Inf. Theory.
[27] J. Nadal,et al. From statistical physics to statistical inference and back , 1994 .
[28] Serap A. Savari,et al. Generalized Tunstall codes for sources with memory , 1997, IEEE Trans. Inf. Theory.
[29] David L. Neuhoff,et al. Simplistic Universal Coding , 1998, IEEE Trans. Inf. Theory.
[30] Shlomo Shamai,et al. On information rates for mismatched decoders , 1994, IEEE Trans. Inf. Theory.
[31] T. T. Kadota. Generalization of Feinstein's fundamental lemma (Corresp.) , 1970, IEEE Trans. Inf. Theory.
[32] Frans M. J. Willems,et al. The context-tree weighting method: basic properties , 1995, IEEE Trans. Inf. Theory.
[33] Alberto Leon-Garcia,et al. A source matching approach to finding minimax codes , 1980, IEEE Trans. Inf. Theory.
[34] J. Wolfowitz. The coding of messages subject to chance errors , 1957 .
[35] Sergio Verdú,et al. The source-channel separation theorem revisited , 1995, IEEE Trans. Inf. Theory.
[36] Leon Gordon Kraft,et al. A device for quantizing, grouping, and coding amplitude-modulated pulses , 1949 .
[37] Masoud Salehi,et al. Multiple access channels with arbitrarily correlated sources , 1980, IEEE Trans. Inf. Theory.
[38] David J. Sakrison,et al. The rate of a class of random processes , 1970, IEEE Trans. Inf. Theory.
[39] Brockway McMillan,et al. Two inequalities implied by unique decipherability , 1956, IRE Trans. Inf. Theory.
[40] Edward C. van der Meulen,et al. Some Reflections On The Interference Channel , 1994 .
[41] Tamás Linder,et al. On source coding with side-information-dependent distortion measures , 2000, IEEE Trans. Inf. Theory.
[42] Imre Csiszár,et al. Channel capacity for a given decoding metric , 1995, IEEE Trans. Inf. Theory.
[43] Rudolf Ahlswede,et al. Multi-way communication channels , 1973 .
[44] Paul H. Siegel,et al. Codes for Digital Recorders , 1998, IEEE Trans. Inf. Theory.
[45] E. Posner,et al. Epsilon Entropy and Data Compression , 1971 .
[46] David L. Neuhoff,et al. Fixed rate universal block source coding with a fidelity criterion , 1975, IEEE Trans. Inf. Theory.
[47] Gunter Dueck. The Capacity Region of the Two-Way Channel Can Exceed the Inner Bound , 1979, Inf. Control..
[48] R. McEliece,et al. Some Information Theoretic Saddlepoints , 1985 .
[49] Thomas M. Cover,et al. Non-white Gaussian multiple access channels with feedback , 1994, IEEE Trans. Inf. Theory.
[50] Robert G. Gallager,et al. A simple derivation of the coding theorem and some applications , 1965, IEEE Trans. Inf. Theory.
[51] Zhen Zhang,et al. An on-line universal lossy data compression algorithm via continuous codebook refinement - Part I: Basic results , 1996, IEEE Trans. Inf. Theory.
[52] H. S. Witsenhausen. The zero-error side information problem and chromatic numbers (Corresp.) , 1976, IEEE Trans. Inf. Theory.
[53] C. Shannon. Probability of error for optimal codes in a Gaussian channel , 1959 .
[54] Toby Berger,et al. Lossy Source Coding , 1998, IEEE Trans. Inf. Theory.
[55] D. Ornstein,et al. The Shannon-McMillan-Breiman theorem for a class of amenable groups , 1983 .
[56] David A. Huffman,et al. A method for the construction of minimum-redundancy codes , 1952, Proceedings of the IRE.
[57] Emre Telatar,et al. The Compound Channel Capacity of a Class of Finite-State Channels , 1998, IEEE Trans. Inf. Theory.
[58] Martin Vetterli,et al. Data Compression and Harmonic Analysis , 1998, IEEE Trans. Inf. Theory.
[59] Yossef Steinberg,et al. An algorithm for source coding subject to a fidelity criterion, based on string matching , 1993, IEEE Trans. Inf. Theory.
[60] P. Hall,et al. On the estimation of entropy , 1993 .
[61] John R. Pierce,et al. The early days of information theory , 1973, IEEE Trans. Inf. Theory.
[62] Benjamin Weiss,et al. Entropy and data compression schemes , 1993, IEEE Trans. Inf. Theory.
[63] Andrew Chi-Chih Yao. Some complexity questions related to distributive computing (Preliminary Report) , 1979, STOC.
[64] Toby Berger,et al. The CEO problem [multiterminal source coding] , 1996, IEEE Trans. Inf. Theory.
[65] A. Wyner. A note on the capacity of the band-limited Gaussian channel , 1966, The Bell System Technical Journal.
[66] Abraham Lempel,et al. Compression of individual sequences via variable-rate coding , 1978, IEEE Trans. Inf. Theory.
[67] Rafail Krichevsky. Universal Compression and Retrieval , 1994 .
[68] Vinay A. Vaishampayan,et al. Design of multiple description scalar quantizers , 1993, IEEE Trans. Inf. Theory.
[69] David S. Slepian,et al. Information theory in the fifties , 1973, IEEE Trans. Inf. Theory.
[70] D. A. Bell,et al. Information Theory and Reliable Communication , 1969 .
[71] R. Ahlswede. The Capacity Region of a Channel with Two Senders and Two Receivers , 1974 .
[72] Imre Csiszár,et al. Broadcast channels with confidential messages , 1978, IEEE Trans. Inf. Theory.
[73] Sergio Verdú,et al. A general formula for channel capacity , 1994, IEEE Trans. Inf. Theory.
[74] Toby Berger,et al. Multiterminal source encoding with one distortion criterion , 1989, IEEE Trans. Inf. Theory.
[75] Philippe Jacquet,et al. Asymptotic Behavior of the Lempel-Ziv Parsing Scheme and Digital Search Trees , 1995, Theor. Comput. Sci..
[76] Donald E. Knuth,et al. Dynamic Huffman Coding , 1985, J. Algorithms.
[77] Elwyn R. Berlekamp,et al. Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. II , 1967, Inf. Control..
[78] L. Goddard. Information Theory , 1962, Nature.
[79] Norbert Wiener,et al. Extrapolation, Interpolation, and Smoothing of Stationary Time Series , 1964 .
[80] Shun-ichi Amari,et al. Methods of information geometry , 2000 .
[81] Prakash Narayan,et al. Reliable Communication Under Channel Uncertainty , 1998, IEEE Trans. Inf. Theory.
[82] En-Hui Yang,et al. Sequential codes, lossless compression of individual sequences, and Kolmogorov complexity , 1996, IEEE Trans. Inf. Theory.
[83] Alon Orlitsky,et al. Average-case interactive communication , 1992, IEEE Trans. Inf. Theory.
[84] David Tse,et al. Multiaccess Fading Channels-Part I: Polymatroid Structure, Optimal Resource Allocation and Throughput Capacities , 1998, IEEE Trans. Inf. Theory.
[85] David L. Neuhoff,et al. Variable-to-fixed length codes provide better large deviations performance than fixed-to-variable length codes , 1992, IEEE Trans. Inf. Theory.
[86] Amiel Feinstein,et al. Information and information stability of random variables and processes , 1964 .
[87] Ian H. Witten,et al. Arithmetic coding for data compression , 1987, CACM.
[88] Amiel Feinstein. On the coding theorem and its converse for finite-memory channels , 1959 .
[89] Lawrence H. Ozarow,et al. The capacity of the white Gaussian multiple access channel with feedback , 1984, IEEE Trans. Inf. Theory.
[90] Ming Li,et al. An Introduction to Kolmogorov Complexity and Its Applications , 2019, Texts in Computer Science.
[91] Jacob Ziv,et al. Universal decoding for finite-state channels , 1985, IEEE Trans. Inf. Theory.
[92] Rudolf Ahlswede,et al. Coloring hypergraphs: A new approach to multi-user source coding, 1 , 1979 .
[93] Sergio Verdú,et al. Approximation theory of output statistics , 1993, IEEE Trans. Inf. Theory.
[94] Jack I. Karush. A simple proof of an inequality of McMillan (Corresp.) , 1961, IRE Trans. Inf. Theory.
[95] R. J. McEliece,et al. An improved upper bound on the block coding error exponent for binary input discrete memoryless channels , 1976 .
[96] Shlomo Shamai,et al. Capacity of channels with uncoded side information , 1995, Eur. Trans. Telecommun..
[97] V. Erokhin. $\varepsilon$-Entropy of a Discrete Random Variable , 1958 .
[98] Pierre A. Humblet,et al. The capacity region of the totally asynchronous multiple-access channel , 1985, IEEE Trans. Inf. Theory.
[99] Ralph Linsker,et al. Sensory Processing and Information Theory , 1994 .
[100] L. Devroye. A Course in Density Estimation , 1987 .
[101] Jorma Rissanen,et al. Universal coding, information, prediction, and estimation , 1984, IEEE Trans. Inf. Theory.
[102] P. Elias. The Efficient Construction of an Unbiased Random Sequence , 1972 .
[103] Gregory J. Chaitin. Algorithmic information theory , 1987, Cambridge Tracts in Theoretical Computer Science.
[104] Jacob Wolfowitz,et al. Channels with Arbitrarily Varying Channel Probability Functions , 1962, Inf. Control..
[105] Jacob Wolfowitz,et al. Memory Increases Capacity , 1967, Inf. Control..
[106] N. Sloane,et al. Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. I , 1993 .
[107] Toby Berger,et al. Fixed-slope universal lossy data compression , 1997, IEEE Trans. Inf. Theory.
[108] R. Ahlswede. Elimination of correlation in random codes for arbitrarily varying channels , 1978 .
[109] A. Kolmogorov. Three approaches to the quantitative definition of information , 1968 .
[110] Michelle Effros,et al. A vector quantization approach to universal noiseless coding and quantization , 1996, IEEE Trans. Inf. Theory.
[111] Meir Feder,et al. Universal Decoding for Channels with Memory , 1998, IEEE Trans. Inf. Theory.
[112] Cyril Leung,et al. An achievable rate region for the multiple-access channel with feedback , 1981, IEEE Trans. Inf. Theory.
[113] Amir Dembo,et al. Information theoretic inequalities , 1991, IEEE Trans. Inf. Theory.
[114] Thomas M. Cover,et al. A Proof of the Data Compression Theorem of Slepian and Wolf for Ergodic Sources , 1971 .
[115] Quentin F. Stout. Improved prefix encodings of the natural numbers (Corresp.) , 1980, IEEE Trans. Inf. Theory.
[116] Max H. M. Costa,et al. On the Gaussian interference channel , 1985, IEEE Trans. Inf. Theory.
[117] Sergio Verdú,et al. Simulation of random processes and rate-distortion theory , 1996, IEEE Trans. Inf. Theory.
[118] S. Rice. Mathematical analysis of random noise , 1944 .
[119] A. Robert Calderbank,et al. The Art of Signaling: Fifty Years of Coding Theory , 1998, IEEE Trans. Inf. Theory.
[120] I. Csiszár. Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems , 1991 .
[121] Elwyn R. Berlekamp,et al. A lower bound to the distribution of computation for sequential decoding , 1967, IEEE Trans. Inf. Theory.
[122] Aydano B. Carleial,et al. A case where interference does not reduce capacity (Corresp.) , 1975, IEEE Trans. Inf. Theory.
[123] L. L. Campbell,et al. The relation between information theory and the differential geometry approach to statistics , 1985, Inf. Sci..
[124] D. Ornstein,et al. Universal Almost Sure Data Compression , 1990 .
[125] Moni Naor,et al. Three results on interactive communication , 1993, IEEE Trans. Inf. Theory.
[126] Toby Berger,et al. An upper bound on the rate distortion function for source coding with partial side information at the decoder , 1979, IEEE Trans. Inf. Theory.
[127] I. Csiszár. Generalized Cutoff Rates and Renyi's Information Measures , 1993, Proceedings. IEEE International Symposium on Information Theory.
[128] David Haussler,et al. A general minimax result for relative entropy , 1997, IEEE Trans. Inf. Theory.
[129] Wojciech Szpankowski,et al. Asymptotic properties of data compression and suffix trees , 1993, IEEE Trans. Inf. Theory.
[130] Robert M. Gray,et al. Information rates of autoregressive processes , 1970, IEEE Trans. Inf. Theory.
[131] Thomas M. Cover,et al. Comments on Broadcast Channels , 1998, IEEE Trans. Inf. Theory.
[132] Toby Berger,et al. Multiple description source coding with no excess marginal rate , 1995, IEEE Trans. Inf. Theory.
[133] Amiel Feinstein,et al. Error bounds in noisy channels without memory , 1955, IRE Trans. Inf. Theory.
[134] Zhen Zhang,et al. On the CEO problem , 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[135] Thomas M. Cover. Broadcast channels , 1972, IEEE Trans. Inf. Theory.
[136] S. Goldman. Some Fundamental Considerations concerning Noise Reduction and Range in Radar and Communication , 1948, Proceedings of the IRE.
[137] Frans M. J. Willems,et al. Variable to fixed-length codes for Markov sources , 1987, IEEE Trans. Inf. Theory.
[138] Imre Csiszár,et al. Graph decomposition: A new key to coding theorems , 1981, IEEE Trans. Inf. Theory.
[139] Jack K. Wolf,et al. Noiseless coding of correlated information sources , 1973, IEEE Trans. Inf. Theory.
[140] Edgar N. Gilbert,et al. Codes based on inaccurate source probabilities , 1971, IEEE Trans. Inf. Theory.
[141] Bin Yu,et al. A rate of convergence result for a universal D-semifaithful code , 1993, IEEE Trans. Inf. Theory.
[142] Claude E. Shannon,et al. Certain Results in Coding Theory for Noisy Channels , 1957, Inf. Control..
[143] Thomas M. Cover,et al. Elements of Information Theory , 2005 .
[144] Frederick Jelinek,et al. On variable-length-to-block coding , 1972, IEEE Trans. Inf. Theory.
[145] Abraham Lempel,et al. On the Complexity of Finite Sequences , 1976, IEEE Trans. Inf. Theory.
[146] Thomas M. Cover,et al. Gaussian feedback capacity , 1989, IEEE Trans. Inf. Theory.
[147] Hsueh-Ming Hang,et al. Image and video coding standards , 1993, AT&T Technical Journal.
[148] Brian L. Hughes,et al. A new universal random coding bound for the multiple-access channel , 1996, IEEE Trans. Inf. Theory.
[149] Alon Orlitsky,et al. Zero-Error Information Theory , 1998, IEEE Trans. Inf. Theory.
[150] John C. Kieffer,et al. A survey of the theory of source coding with a fidelity criterion , 1993, IEEE Trans. Inf. Theory.
[151] Antonio Ortega,et al. Multiresolution broadcast for digital HDTV using joint source-channel coding , 1992, [Conference Record] SUPERCOMM/ICC '92 Discovering a New World of Communications.
[152] Aaron D. Wyner,et al. Some asymptotic properties of the entropy of a stationary ergodic data source with applications to data compression , 1989, IEEE Trans. Inf. Theory.
[153] Frans M. J. Willems,et al. A universal variable-to-fixed length source code based on Lawrence's algorithm , 1992, IEEE Trans. Inf. Theory.
[154] Sanjeev R. Kulkarni,et al. Source codes as random number generators , 1997, Proceedings of IEEE International Symposium on Information Theory.
[155] Brian Parker Tunstall,et al. Synthesis of noiseless compression codes , 1967 .
[156] J. Massey,et al. Communications and Cryptography: Two Sides of One Tapestry , 1994 .
[157] Robert J. McEliece,et al. An improved upper bound on the block coding error exponent for binary-input discrete memoryless channels (Corresp.) , 1977, IEEE Trans. Inf. Theory.
[158] J. M. Whittaker. The “Fourier” Theory of the Cardinal Function , 1928 .
[159] Hiroshi Sato,et al. The capacity of the Gaussian interference channel under strong interference , 1981, IEEE Trans. Inf. Theory.
[160] Nicolas Sourlas,et al. Statistical Mechanics and Error-Correcting Codes , 1990 .
[161] Lev B. Levitin,et al. Entropy of natural languages: Theory and experiment , 1994 .
[162] Erdal Arikan,et al. An upper bound on the cutoff rate of sequential decoding , 1988, IEEE Trans. Inf. Theory.
[163] Neri Merhav,et al. Universal Prediction , 1998, IEEE Trans. Inf. Theory.
[164] En-Hui Yang,et al. Simple universal lossy data compression schemes derived from the Lempel-Ziv algorithm , 1996, IEEE Trans. Inf. Theory.
[165] Abbas El Gamal,et al. Achievable rates for multiple descriptions , 1982, IEEE Trans. Inf. Theory.
[166] T. Cover,et al. A sandwich proof of the Shannon-McMillan-Breiman theorem , 1988 .
[167] S. Golomb. Run-length encodings , 1966 .
[168] J. Wolfowitz. Simultaneous channels , 1959 .
[169] W. G. Tuller,et al. Theoretical Limitations on the Rate of Transmission of Information , 1949, Proceedings of the IRE.
[170] Andrei N. Kolmogorov,et al. Logical basis for information theory and probability theory , 1968, IEEE Trans. Inf. Theory.
[171] B. McMillan. The Basic Theorems of Information Theory , 1953 .
[172] Joel G. Smith,et al. The Information Capacity of Amplitude- and Variance-Constrained Scalar Gaussian Channels , 1971, Inf. Control..
[173] Bruce Hajek,et al. A Decomposition Theorem for Binary Markov Random Fields , 1987 .
[174] Claude E. Shannon,et al. General treatment of the problem of coding , 1953, Trans. IRE Prof. Group Inf. Theory.
[175] Shun-ichi Amari,et al. Statistical Inference Under Multiterminal Data Compression , 1998, IEEE Trans. Inf. Theory.
[176] Amos Lapidoth,et al. Mismatched decoding and the multiple-access channel , 1994, IEEE Trans. Inf. Theory.
[178] K. Marton. Bounding $\bar{d}$-distance by informational divergence: a method to prove measure concentration , 1996 .
[179] Toby Berger,et al. The quadratic Gaussian CEO problem , 1997, IEEE Trans. Inf. Theory.
[180] Alon Orlitsky. Interactive communication , 1996, Optics & Photonics.
[181] Robert M. Gray,et al. Ergodic and information theory , 1977 .
[182] S. Kullback,et al. On Information and Sufficiency , 1951, The Annals of Mathematical Statistics.
[183] W. H. Paik,et al. The Grand Alliance system for US HDTV , 1995 .
[184] Toby Berger,et al. Information rates of Wiener processes , 1970, IEEE Trans. Inf. Theory.
[185] Shlomo Shamai,et al. Fading Channels: Information-Theoretic and Communication Aspects , 1998, IEEE Trans. Inf. Theory.
[186] David L. Neuhoff,et al. Quantization , 1998, IEEE Trans. Inf. Theory.
[187] Robert G. Gallager,et al. Variations on a theme by Huffman , 1978, IEEE Trans. Inf. Theory.
[188] Amos Lapidoth,et al. On the Universality of the LZ-Based Decoding Algorithm , 1998, IEEE Trans. Inf. Theory.
[189] Solomon W. Golomb,et al. Run-length encodings (Corresp.) , 1966, IEEE Trans. Inf. Theory.
[190] Ray J. Solomonoff,et al. A Formal Theory of Inductive Inference. Part I , 1964, Inf. Control..
[191] Andries P. Hekstra,et al. Dependence balance bounds for single-output two-way channels , 1989, IEEE Trans. Inf. Theory.
[192] A. J. Stam. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon , 1959, Inf. Control..
[193] David L. Neuhoff,et al. Strong universal source coding subject to a rate-distortion constraint , 1982, IEEE Trans. Inf. Theory.
[194] Rudolf Ahlswede,et al. Universal coding of integers and unbounded search trees , 1997, IEEE Trans. Inf. Theory.
[195] Robert Price. A Conversation with Claude Shannon: One Man's Approach to Problem Solving , 1985, Cryptologia.
[196] Robert G. Gallager,et al. A perspective on multiaccess channels , 1984, IEEE Trans. Inf. Theory.
[197] B.S. Choi,et al. An information-theoretic proof of Burg's maximum entropy spectrum , 1984, Proceedings of the IEEE.
[198] J. L. Holsinger,et al. Digital communication over fixed time-continuous channels with memory, with special application to telephone channels , 1964 .
[199] Aaron D. Wyner,et al. Efficient Coding of a Binary Source with One Very Infrequent Symbol (Bell Laboratories Memorandum, Jan. 29, 1954) , 1993 .
[200] Andrea Sgarro,et al. Tunstall adaptive coding and miscoding , 1996, IEEE Trans. Inf. Theory.
[201] Sergio Verdú. The exponential distribution in information theory , 1996 .
[202] J. Pieter M. Schalkwijk,et al. The binary multiplying channel--A coding scheme that operates beyond Shannon's inner bound region , 1982, IEEE Trans. Inf. Theory.
[203] Sergio Verdú,et al. The capacity region of the symbol-asynchronous Gaussian multiple-access channel , 1989, IEEE Trans. Inf. Theory.
[204] Gregory J. Chaitin,et al. On the Length of Programs for Computing Finite Binary Sequences , 1966, JACM.
[205] R. M. Fano. Transmission of Information , 1961 .
[206] Claude E. Shannon,et al. Communication theory of secrecy systems , 1949, Bell Syst. Tech. J..
[207] A. Wald. Note on the Consistency of the Maximum Likelihood Estimate , 1949 .
[208] Shlomo Shamai,et al. Systematic Lossy Source/Channel Coding , 1998, IEEE Trans. Inf. Theory.
[209] Max H. M. Costa,et al. The capacity region of the discrete memoryless interference channel with strong interference , 1987, IEEE Trans. Inf. Theory.
[210] S. Rice. Communication in the presence of noise — Probability of error for two encoding schemes , 1950 .
[211] Kung Yao,et al. Evaluation of rate-distortion functions for a class of independent identically distributed sources under an absolute-magnitude criterion , 1975, IEEE Trans. Inf. Theory.
[212] Aaron D. Wyner,et al. On the Role of Pattern Matching in Information Theory , 1998, IEEE Trans. Inf. Theory.
[213] C.E. Shannon,et al. Communication in the Presence of Noise , 1949, Proceedings of the IRE.
[214] Jacob Ziv,et al. Coding of sources with unknown statistics-II: Distortion relative to a fidelity criterion , 1972, IEEE Trans. Inf. Theory.
[215] Kung Yao,et al. Absolute error rate-distortion functions for sources with constrained magnitudes (Corresp.) , 1978, IEEE Trans. Inf. Theory.
[216] Sergio Verdú,et al. Sensitivity of channel capacity , 1995, IEEE Trans. Inf. Theory.
[217] Simon Litsyn,et al. New Upper Bounds on Error Exponents , 1999, IEEE Trans. Inf. Theory.
[218] Peter Elias,et al. Universal codeword sets and representations of the integers , 1975, IEEE Trans. Inf. Theory.
[219] Shu-Teh Chen Moy,et al. Generalizations of Shannon-McMillan theorem , 1961 .
[220] Hirosuke Yamamoto,et al. Information theory in cryptology , 1991 .
[221] J. Wolfowitz,et al. The capacity of a channel with arbitrarily varying channel probability functions and binary output alphabet , 1970 .
[222] R. Hunter,et al. International digital facsimile coding standards , 1980, Proceedings of the IEEE.
[223] Toby Berger,et al. New results in binary multiple descriptions , 1987, IEEE Trans. Inf. Theory.
[224] Solomon W. Golomb,et al. Probability, information theory, and prime number theory , 1992, Discrete Mathematics.
[225] R. Marks. Introduction to Shannon Sampling and Interpolation Theory , 1990 .
[226] Julia Abrahams,et al. Code and parse tree for lossless source encoding , 2001, Commun. Inf. Syst..
[227] Aaron D. Wyner,et al. Some Geometrical Results in Channel Capacity (Nachrichtentechnische Zeitschrift, vol. 10, 1957) , 1993 .
[228] Frederick Jelinek,et al. Statistical methods for speech recognition , 1997 .
[229] Wojciech Szpankowski,et al. A suboptimal lossy data compression based on approximate pattern matching , 1997, IEEE Trans. Inf. Theory.
[230] A. Barron. Entropy and the central limit theorem , 1986 .
[231] B. Kowalski,et al. Theory of analytical chemistry , 1994 .
[232] Ray J. Solomonoff,et al. A Formal Theory of Inductive Inference. Part II , 1964, Inf. Control..
[233] A. Dembo. Information inequalities and concentration of measure , 1997 .
[234] Jacob Ziv,et al. Coding theorems for individual sequences , 1978, IEEE Trans. Inf. Theory.
[235] Rudolf Ahlswede,et al. The rate-distortion region for multiple descriptions without excess rate , 1985, IEEE Trans. Inf. Theory.
[236] O. Hijab,et al. Stabilization of control systems , 1986 .
[237] Bixio Rimoldi,et al. Successive refinement of information: characterization of the achievable rates , 1994, IEEE Trans. Inf. Theory.
[238] Noga Alon,et al. The Shannon Capacity of a Union , 1998, Comb..
[239] Jorma Rissanen,et al. The Minimum Description Length Principle in Coding and Modeling , 1998, IEEE Trans. Inf. Theory.
[240] Shlomo Shamai,et al. A broadcast strategy for the Gaussian slowly fading channel , 1997, Proceedings of IEEE International Symposium on Information Theory.
[241] Neri Merhav,et al. A strong version of the redundancy-capacity theorem of universal coding , 1995, IEEE Trans. Inf. Theory.
[242] Rory A. Fisher. Probability, likelihood and quantity of information in the logic of uncertain inference , 1934 .
[243] A. El Gamal,et al. Multiple user information theory , 1980, Proceedings of the IEEE.
[244] Zhen Zhang,et al. An on-line universal lossy data compression algorithm via continuous codebook refinement - Part II. Optimality for phi-mixing source models , 1996, IEEE Trans. Inf. Theory.
[245] Imre Csiszár,et al. The capacity of the arbitrarily varying channel revisited: Positivity, constraints , 1988, IEEE Trans. Inf. Theory.
[246] Hans-Martin Wallmeier,et al. Random coding bound and codes produced by permutations for the multiple-access channel , 1985, IEEE Trans. Inf. Theory.
[247] Hans S. Witsenhausen. Some aspects of convexity useful in information theory , 1980, IEEE Trans. Inf. Theory.
[248] Richard E. Blahut,et al. Computation of channel capacity and rate-distortion functions , 1972, IEEE Trans. Inf. Theory.
[249] Ram Zamir,et al. The rate loss in the Wyner-Ziv problem , 1996, IEEE Trans. Inf. Theory.
[250] Rudolf Ahlswede,et al. Common randomness in information theory and cryptography - I: Secret sharing , 1993, IEEE Trans. Inf. Theory.
[251] Amos Lapidoth,et al. Nearest neighbor decoding for additive non-Gaussian noise channels , 1996, IEEE Trans. Inf. Theory.
[252] Richard Clark Pasco,et al. Source coding algorithms for fast data compression , 1976 .
[253] S. Verdu,et al. Multiple-access channels with memory with and without frame synchronism , 1989, IEEE Trans. Inf. Theory.
[254] Yuri M. Suhov,et al. Nonparametric Entropy Estimation for Stationary Processes and Random Fields, with Applications to English Text , 1998, IEEE Trans. Inf. Theory.
[255] Peter Elias,et al. Zero error capacity under list decoding , 1988, IEEE Trans. Inf. Theory.
[256] S. Kullback,et al. Topics in statistical information theory , 1987 .
[257] Claude E. Shannon,et al. The zero error capacity of a noisy channel , 1956, IRE Trans. Inf. Theory.
[258] I. Csiszár,et al. On the capacity of the arbitrarily varying channel for maximum probability of error , 1981 .
[259] Claude E. Shannon,et al. Prediction and Entropy of Printed English , 1951 .
[260] Frans M. J. Willems,et al. Universal data compression and repetition times , 1989, IEEE Trans. Inf. Theory.
[261] H. Dudley. Thirty Years of Vocoder Research , 1964 .
[262] D. J. Wheeler,et al. A Block-sorting Lossless Data Compression Algorithm , 1994 .
[263] M. Sablatash. Transmission of all-digital advanced television: state of the art and future directions , 1994 .
[264] A. Calderbank,et al. Multilevel Codes for Unequal Error Protection , 1993, Proceedings. IEEE International Symposium on Information Theory.
[265] Sergio Verdú,et al. Modulation and Coding for Linear Gaussian Channels , 2000 .
[266] Jacob Ziv,et al. Variable-to-fixed length codes are better than fixed-to-variable length codes for Markov sources , 1990, IEEE Trans. Inf. Theory.
[267] A. Barron. The strong ergodic theorem for densities: generalized Shannon-McMillan-Breiman theorem , 1985 .
[268] Michelle Effros,et al. Variable-rate source coding theorems for stationary nonergodic sources , 1994, IEEE Trans. Inf. Theory.
[269] David L. Neuhoff,et al. Fixed-rate universal codes for Markov sources , 1978, IEEE Trans. Inf. Theory.
[270] G. David Forney,et al. Exponential error bounds for erasure, list, and decision feedback schemes , 1968, IEEE Trans. Inf. Theory.
[271] D. Blackwell,et al. The Capacity of a Class of Channels , 1959 .
[272] H. Nyquist,et al. Certain Topics in Telegraph Transmission Theory , 1928, Transactions of the American Institute of Electrical Engineers.
[273] Alon Orlitsky,et al. Worst-case interactive communication I: Two messages are almost optimal , 1990, IEEE Trans. Inf. Theory.
[274] Don E. Clegg. Information theory in analytical chemistry , 1995 .
[275] E. F. Moore,et al. Variable-length binary encodings , 1959 .
[276] Terry A. Welch,et al. A Technique for High-Performance Data Compression , 1984, Computer.
[277] A. Shiryayev. New Metric Invariant of Transitive Dynamical Systems and Automorphisms of Lebesgue Spaces , 1993 .
[278] Aaron D. Wyner,et al. Coding Theorems for a Discrete Source With a Fidelity Criterion (Institute of Radio Engineers, International Convention Record, vol. 7, 1959) , 1993 .
[279] Emre Telatar. Zero-error list capacities of discrete memoryless channels , 1997, IEEE Trans. Inf. Theory.
[280] Peter W. Shor,et al. Quantum Information Theory , 1998, IEEE Trans. Inf. Theory.
[281] J. Pieter M. Schalkwijk,et al. A coding scheme for additive noise channels with feedback-II: Band-limited signals , 1966, IEEE Trans. Inf. Theory.
[282] Amiel Feinstein,et al. On the coding theorem and its converse for finite-memory channels , 1959, Inf. Control..
[283] H. P. Yockey. Information Theory and Molecular Biology , 1992 .
[284] D. Ornstein. Bernoulli shifts with the same entropy are isomorphic , 1970 .
[285] Fletcher Pratt. Secret and Urgent , 1939 .
[286] Andrei N. Kolmogorov,et al. On the Shannon theory of information transmission in the case of continuous signals , 1956, IRE Trans. Inf. Theory.
[287] Sergio Verdú,et al. On channel capacity per unit cost , 1990, IEEE Trans. Inf. Theory.
[288] P. Gács. Kolmogorov's contributions to information theory and algorithmic complexity , 1989 .
[289] J. G. Daugman. Information Theory and Coding , 2005 .
[290] Peter Elias,et al. Interval and recency rank source coding: Two on-line adaptive variable-length schemes , 1987, IEEE Trans. Inf. Theory.
[291] Claude E. Shannon,et al. A symbolic analysis of relay and switching circuits , 1938, Transactions of the American Institute of Electrical Engineers.
[292] Sergio Verdú,et al. The role of the asymptotic equipartition property in noiseless source coding , 1997, IEEE Trans. Inf. Theory.
[293] Abraham Lempel,et al. A universal algorithm for sequential data compression , 1977, IEEE Trans. Inf. Theory.
[294] A. D. Santis,et al. Variations on a Theme by Gallager , 1992 .
[295] Jacob Ziv,et al. Distortion-rate theory for individual sequences , 1980, IEEE Trans. Inf. Theory.
[296] Imre Csiszár,et al. Capacity and decoding rules for classes of arbitrarily varying channels , 1989, IEEE Trans. Inf. Theory.
[297] J. Pieter M. Schalkwijk. On an extension of an achievable rate region for the binary multiplying channel , 1983, IEEE Trans. Inf. Theory.
[298] F. Jelinek. Evaluation of distortion rate functions for low distortions , 1967 .
[299] Sergio Verdú,et al. Fading Channels: Information-Theoretic and Communications Aspects , 2000 .
[300] J. Kieffer. A simple proof of the Moy-Perez generalization of the Shannon-McMillan theorem , 1974 .
[301] Thomas M. Cover,et al. Enumerative source encoding , 1973, IEEE Trans. Inf. Theory.
[302] V. V. Prelov. Communication Channel Capacity with Almost Gaussian Noise , 1989 .
[303] R. G. Gallager,et al. Coding of Sources With Unknown Statistics- Part II: Distortion Relative to a Fidelity Criterion , 1972 .
[304] Gregory J. Chaitin. Algorithmic Information Theory , 1977, IBM J. Res. Dev.
[306] Anthony Ephremides,et al. Information Theory and Communication Networks: An Unconsummated Union , 1998, IEEE Trans. Inf. Theory.
[307] Sergio Verdú,et al. Bits through queues , 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[308] Shunsuke Ihara,et al. Information theory - for continuous systems , 1993 .
[309] Thomas M. Cover,et al. A convergent gambling estimate of the entropy of English , 1978, IEEE Trans. Inf. Theory.
[310] Kenneth Rose,et al. A mapping approach to rate-distortion computation and analysis , 1994, IEEE Trans. Inf. Theory.
[311] Thomas J. Goblick,et al. Theoretical limitations on the transmission of data from analog sources , 1965, IEEE Trans. Inf. Theory.
[312] W. T. Grandy. Resource Letter ITP-1: Information Theory in Physics , 1997 .
[313] Richard E. Blahut,et al. Hypothesis testing and information theory , 1974, IEEE Trans. Inf. Theory.
[314] Jacob Ziv,et al. The behavior of analog communication systems , 1970, IEEE Trans. Inf. Theory.
[315] Y.S. Abu-Mostafa,et al. Information theory, complexity and neural networks , 1989, IEEE Communications Magazine.
[316] Peter Grassberger,et al. Estimating the information content of symbol sequences and efficient codes , 1989, IEEE Trans. Inf. Theory.
[317] Jorma Rissanen,et al. Generalized Kraft Inequality and Arithmetic Coding , 1976, IBM J. Res. Dev..
[318] Prakash Narayan,et al. Gaussian arbitrarily varying channels , 1987, IEEE Trans. Inf. Theory.
[319] C. Shannon,et al. An algebra for theoretical genetics , 1940 .
[320] Tamás Linder,et al. Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding , 1994, IEEE Trans. Inf. Theory.
[321] Aaron D. Wyner,et al. Recent results in the Shannon theory , 1974, IEEE Trans. Inf. Theory.
[322] F. Ellersick,et al. A conversation with Claude Shannon , 1984, IEEE Communications Magazine.
[323] Raphail E. Krichevsky,et al. The performance of universal encoding , 1981, IEEE Trans. Inf. Theory.
[324] Sanjeev R. Kulkarni,et al. Learning Pattern Classification - A Survey , 1998, IEEE Trans. Inf. Theory.
[325] D. Slepian,et al. On bandwidth , 1976, Proceedings of the IEEE.
[326] U. Grenander,et al. Toeplitz Forms and Their Applications , 1958 .
[327] Paul C. Shields,et al. The Interactions Between Ergodic Theory and Information Theory , 1998, IEEE Trans. Inf. Theory.
[328] Shunsuke Ihara,et al. On the Capacity of Channels with Additive Non-Gaussian Noise , 1978, Inf. Control..
[329] Vladimir B. Balakirsky. Coding Theorem for Discrete Memoryless Channels with Given Decision Rule , 1991, Algebraic Coding.
[330] H. Pollak. Prolate spheroidal wave functions, Fourier analysis and uncertainty — III: The dimension of the space of essentially time- and band-limited signals , 1962 .
[331] Harvey S. Leff,et al. Maxwell's Demon 2 , 1990 .
[332] Grebogi,et al. Communicating with chaos , 1993, Physical Review Letters.
[333] Vladimir M. Blinovsky,et al. List decoding , 1992, Discret. Math..
[334] I. Csiszár. Sanov Property, Generalized $I$-Projection and a Conditional Limit Theorem , 1984 .
[335] JORMA RISSANEN,et al. A universal data compression system , 1983, IEEE Trans. Inf. Theory.
[336] Rudolf Ahlswede,et al. Channel capacities for list codes , 1973, Journal of Applied Probability.
[337] Brian L. Hughes. The smallest list for the arbitrarily varying channel , 1997, IEEE Trans. Inf. Theory.
[338] L. Ozarow,et al. On a source-coding problem with two channels and three receivers , 1980, The Bell System Technical Journal.
[339] Shlomo Shamai,et al. The capacity of average and peak-power-limited quadrature Gaussian channels , 1995, IEEE Trans. Inf. Theory.
[340] T. Cover. Some Advances in Broadcast Channels , 1975 .
[341] Phil Clendeninn. The Vocoder , 1940, Nature.
[342] Andrew F. Rex,et al. Maxwell's Demon, Entropy, Information, Computing , 1990 .
[344] Amiel Feinstein,et al. A new basic theorem of information theory , 1954, Trans. IRE Prof. Group Inf. Theory.
[345] A. Barron. Approximation and Estimation Bounds for Artificial Neural Networks , 1991, COLT '91.
[346] Rudolf Ahlswede,et al. Identification via channels , 1989, IEEE Trans. Inf. Theory.
[347] Jacob Wolfowitz,et al. Multiple Access Channels , 1978 .
[348] Tamás Linder,et al. Fixed-rate universal lossy source coding and rates of convergence for memoryless sources , 1995, IEEE Trans. Inf. Theory.
[349] Robert E. Tarjan,et al. A Locally Adaptive Data Compression Scheme , 1986, Commun. ACM.
[350] James L. Massey,et al. Capacity of the discrete-time Gaussian channel with intersymbol interference , 1988, IEEE Trans. Inf. Theory.
[351] Yuhong Yang,et al. Information-theoretic determination of minimax rates of convergence , 1999 .
[352] Peter Elias,et al. List decoding for noisy channels , 1957 .
[353] Erik Ordentlich,et al. On the factor-of-two bound for Gaussian multiple-access channels with feedback , 1995, IEEE Trans. Inf. Theory.
[354] L. Breiman. The Individual Ergodic Theorem of Information Theory , 1957 .
[355] R. L. Dobrushin,et al. Asymptotic Estimates of the Probability of Error for Transmission of Messages over a Discrete Memoryless Communication Channel with a Symmetric Transition Probability Matrix , 1962 .
[356] Imre Csiszár. The Method of Types , 1998, IEEE Trans. Inf. Theory.
[357] Peter Elias,et al. Error-correcting codes for list decoding , 1991, IEEE Trans. Inf. Theory.
[358] Toby Berger,et al. Rate-distortion for correlated sources with partially separated encoders , 1982, IEEE Trans. Inf. Theory.
[359] A. J. Jerri. The Shannon sampling theorem—Its various extensions and applications: A tutorial review , 1977, Proceedings of the IEEE.
[360] D. Slepian,et al. A coding theorem for multiple access channels with correlated sources , 1973 .
[361] I. Csiszár. $I$-Divergence Geometry of Probability Distributions and Minimization Problems , 1975 .
[362] D. Blackwell,et al. The Capacities of Certain Channel Classes Under Random Coding , 1960 .
[363] Robert M. Gray,et al. On the asymptotic eigenvalue distribution of Toeplitz matrices , 1972, IEEE Trans. Inf. Theory.
[364] Te Sun Han,et al. A new achievable rate region for the interference channel , 1981, IEEE Trans. Inf. Theory.
[365] A. D. Wyner,et al. The wire-tap channel , 1975, The Bell System Technical Journal.
[366] I. Vajda. Theory of statistical inference and information , 1989 .
[367] Joy A. Thomas,et al. Feedback can at most double Gaussian multiple access channel capacity , 1987, IEEE Trans. Inf. Theory.
[368] Lee D. Davisson,et al. Universal noiseless coding , 1973, IEEE Trans. Inf. Theory.
[369] Richard E. Blahut,et al. Principles and practice of information theory , 1987 .
[370] Quentin F. Stout. Improved Prefix Encodings of the Natural Numbers , 1998 .
[371] Shlomo Shamai,et al. Worst-case power-constrained noise for binary-input channels , 1992, IEEE Trans. Inf. Theory.
[372] I. P. Tsaregradskii. A Note on the Capacity of a Stationary Channel with Finite Memory , 1958 .
[373] Edmund Taylor Whittaker. XVIII.—On the Functions which are represented by the Expansions of the Interpolation-Theory , 1915 .
[374] Toby Berger,et al. New outer bounds to capacity regions of two-way channels , 1986, IEEE Trans. Inf. Theory.
[375] B.Y. Ryabko,et al. A fast on-line adaptive code , 1992, IEEE Trans. Inf. Theory.
[376] Anand S. Bedekar,et al. The Information-Theoretic Capacity of Discrete-Time Queues , 1997, IEEE Trans. Inf. Theory.
[377] Thomas H. E. Ericson,et al. Exponential error bounds for random codes in the arbitrarily varying channel , 1985, IEEE Trans. Inf. Theory.