Reliable Communication Under Channel Uncertainty
[1] Vladimir B. Balakirsky. Coding Theorem for Discrete Memoryless Channels with Given Decision Rule , 1991, Algebraic Coding.
[2] Jorma Rissanen,et al. The Minimum Description Length Principle in Coding and Modeling , 1998, IEEE Trans. Inf. Theory.
[3] R. Ahlswede. Certain results in coding theory for compound channels , 1967 .
[4] Imre Csiszár,et al. The capacity of the arbitrarily varying channel revisited: Positivity, constraints , 1988, IEEE Trans. Inf. Theory.
[5] Neri Merhav. How many information bits does a decoder need about the channel statistics? , 1997, IEEE Trans. Inf. Theory.
[6] Hans-Martin Wallmeier,et al. Random coding bound and codes produced by permutations for the multiple-access channel , 1985, IEEE Trans. Inf. Theory.
[7] J. Wolfowitz. Coding Theorems of Information Theory , 1962, Ergebnisse der Mathematik und Ihrer Grenzgebiete.
[8] R. McEliece,et al. Some Information Theoretic Saddlepoints , 1985 .
[9] Irvin G. Stiglitz,et al. Coding for a class of unknown channels , 1966, IEEE Trans. Inf. Theory.
[10] Sergio Verdú,et al. A general formula for channel capacity , 1994, IEEE Trans. Inf. Theory.
[11] Rudolf Ahlswede,et al. Common randomness in information theory and cryptography - I: Secret sharing , 1993, IEEE Trans. Inf. Theory.
[12] Anthony G. Oettinger,et al. IEEE Transactions on Information Theory , 1998 .
[13] Rudolf Ahlswede. Arbitrarily varying channels with states sequence known to the sender , 1986, IEEE Trans. Inf. Theory.
[14] Rudolf Ahlswede,et al. Localized random and arbitrary errors in the light of arbitrarily varying channel theory , 1995, IEEE Trans. Inf. Theory.
[15] Shlomo Shamai,et al. On information rates for mismatched decoders , 1994, IEEE Trans. Inf. Theory.
[16] Amos Lapidoth,et al. Nearest neighbor decoding for additive non-Gaussian noise channels , 1996, IEEE Trans. Inf. Theory.
[17] J. Ziv,et al. Universal sequential decoding , 1998, 1998 Information Theory Workshop (Cat. No.98EX131).
[18] Abbas El Gamal,et al. On the capacity of computer memory with defects , 1983, IEEE Trans. Inf. Theory.
[19] C. Shannon. Probability of error for optimal codes in a Gaussian channel , 1959 .
[20] Claude E. Shannon,et al. Channels with Side Information at the Transmitter , 1958, IBM J. Res. Dev..
[21] Toby Berger,et al. Lossy Source Coding , 1998, IEEE Trans. Inf. Theory.
[22] Toby Berger,et al. The CEO problem [multiterminal source coding] , 1996, IEEE Trans. Inf. Theory.
[23] Abraham Lempel,et al. Compression of individual sequences via variable-rate coding , 1978, IEEE Trans. Inf. Theory.
[24] Brian L. Hughes,et al. Nonconvexity of the capacity region of the multiple-access arbitrarily varying channel subject to constraints , 1995, IEEE Trans. Inf. Theory.
[25] Shlomo Shamai,et al. Information-theoretic considerations for symmetric, cellular, multiple-access fading channels - Part II , 1997, IEEE Trans. Inf. Theory.
[27] Emre Telatar,et al. The Compound Channel Capacity of a Class of Finite-State Channels , 1998, IEEE Trans. Inf. Theory.
[28] Meir Feder,et al. Universal Decoding for Channels with Memory , 1998, IEEE Trans. Inf. Theory.
[29] Emre Telatar. Zero-error list capacities of discrete memoryless channels , 1997, IEEE Trans. Inf. Theory.
[30] J. Wolfowitz. Simultaneous channels , 1959 .
[31] E. Gilbert. Capacity of a burst-noise channel , 1960 .
[32] Sergio Verdú,et al. On channel capacity per unit cost , 1990, IEEE Trans. Inf. Theory.
[33] J. Wolfowitz. The coding of messages subject to chance errors , 1957 .
[34] Vladimir B. Balakirsky. A converse coding theorem for mismatched decoding at the output of binary-input memoryless channels , 1995, IEEE Trans. Inf. Theory.
[35] G. David Forney,et al. Exponential error bounds for erasure, list, and decision feedback schemes , 1968, IEEE Trans. Inf. Theory.
[36] Robert G. Gallager,et al. The random coding bound is tight for the average code (Corresp.) , 1973, IEEE Trans. Inf. Theory.
[37] Israel Bar-David,et al. Capacity and coding for the Gilbert-Elliot channels , 1989, IEEE Trans. Inf. Theory.
[38] Neri Merhav,et al. Universal Prediction , 1998, IEEE Trans. Inf. Theory.
[39] William L. Root,et al. Estimates of Epsilon capacity for certain linear communication channels , 1968, IEEE Trans. Inf. Theory.
[40] Peter Elias,et al. Zero error capacity under list decoding , 1988, IEEE Trans. Inf. Theory.
[41] László Lovász,et al. On the Shannon capacity of a graph , 1979, IEEE Trans. Inf. Theory.
[42] Neri Merhav. Universal decoding for memoryless Gaussian channels with a deterministic interference , 1993, IEEE Trans. Inf. Theory.
[43] Elwyn R. Berlekamp,et al. Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. II , 1967, Inf. Control..
[44] Prakash Narayan,et al. The capacity of a vector Gaussian arbitrarily varying channel , 1988, IEEE Trans. Inf. Theory.
[45] Aaron D. Wyner,et al. Shannon-theoretic approach to a Gaussian cellular multiple-access channel , 1994, IEEE Trans. Inf. Theory.
[46] R. Ahlswede. A Note on the Existence of the Weak Capacity for Channels with Arbitrarily Varying Channel Probability Functions and Its Relation to Shannon's Zero Error Capacity , 1970 .
[47] Peter Elias,et al. List decoding for noisy channels , 1957 .
[48] Imre Csiszár,et al. Arbitrarily varying channels with general alphabets and states , 1992, IEEE Trans. Inf. Theory.
[49] Imre Csiszár. The Method of Types , 1998, IEEE Trans. Inf. Theory.
[50] John A. Gubner. State constraints for the multiple-access arbitrarily varying channel , 1991, IEEE Trans. Inf. Theory.
[51] Thomas H. E. Ericson,et al. Exponential error bounds for random codes in the arbitrarily varying channel , 1985, IEEE Trans. Inf. Theory.
[52] John A. Gubner. On the deterministic-code capacity of the multiple-access arbitrarily varying channel , 1990, IEEE Trans. Inf. Theory.
[53] Rudolf Ahlswede,et al. Correlated Decoding for Channels with Arbitrarily Varying Channel Probability Functions , 1969, Inf. Control..
[54] Theodore S. Rappaport,et al. Wireless communications - principles and practice , 1996 .
[55] Shlomo Shamai,et al. A broadcast strategy for the Gaussian slowly fading channel , 1997, Proceedings of IEEE International Symposium on Information Theory.
[56] R. Ahlswede. Elimination of correlation in random codes for arbitrarily varying channels , 1978 .
[57] D. Blackwell,et al. The Capacity of a Class of Channels , 1959 .
[58] Rudolf Ahlswede,et al. Multi-way communication channels , 1973 .
[59] Prakash Narayan,et al. Gaussian arbitrarily varying channels , 1987, IEEE Trans. Inf. Theory.
[60] Aaron D. Wyner,et al. On the Role of Pattern Matching in Information Theory , 1998, IEEE Trans. Inf. Theory.
[61] Johann-Heinrich Jahn,et al. Coding of arbitrarily varying multiuser channels , 1981, IEEE Trans. Inf. Theory.
[62] Michael L. Honig,et al. Bounds on epsilon-rate for linear, time-invariant, multiinput/multioutput channels , 1990, IEEE Trans. Inf. Theory.
[63] Frederick Jelinek,et al. Indecomposable Channels with Side Information at the Transmitter , 1965, Inf. Control..
[64] Rudolf Ahlswede,et al. Common Randomness in Information Theory and Cryptography - Part II: CR Capacity , 1998, IEEE Trans. Inf. Theory.
[65] Shlomo Shamai,et al. Fading Channels: Information-Theoretic and Communication Aspects , 1998, IEEE Trans. Inf. Theory.
[66] Imre Csiszár,et al. Channel capacity for a given decoding metric , 1995, IEEE Trans. Inf. Theory.
[67] Shlomo Shamai,et al. Information theoretic considerations for cellular mobile radio , 1994 .
[68] D. Blackwell,et al. Proof of Shannon's Transmission Theorem for Finite-State Indecomposable Channels , 1958 .
[69] Rudolf Ahlswede,et al. Two proofs of Pinsker's conjecture concerning arbitrarily varying channels , 1991, IEEE Trans. Inf. Theory.
[70] Imre Csiszár,et al. Graph decomposition: A new key to coding theorems , 1981, IEEE Trans. Inf. Theory.
[71] J. Wolfowitz,et al. The capacity of a channel with arbitrarily varying channel probability functions and binary output alphabet , 1970 .
[72] Claude E. Shannon,et al. The zero error capacity of a noisy channel , 1956, IRE Trans. Inf. Theory.
[73] I. Csiszár,et al. On the capacity of the arbitrarily varying channel for maximum probability of error , 1981 .
[74] Imre Csiszár,et al. Capacity of the Gaussian arbitrarily varying channel , 1991, IEEE Trans. Inf. Theory.
[75] P. Varaiya,et al. Capacity of Classes of Gaussian Channels , 1968 .
[76] Rüdiger L. Urbanke,et al. A rate-splitting approach to the Gaussian multiple-access channel , 1996, IEEE Trans. Inf. Theory.
[77] Imre Csiszár. Generalized cutoff rates and Renyi's information measures , 1995, IEEE Trans. Inf. Theory.
[78] Rudolf Ahlswede. The maximal error capacity of arbitrarily varying channels for constant list sizes , 1993, IEEE Trans. Inf. Theory.
[79] D. A. Bell,et al. Information Theory and Reliable Communication , 1969 .
[80] A. Sridharan. Broadcast Channels , 2022 .
[81] Thomas M. Cover,et al. Elements of Information Theory , 2005 .
[82] Brian L. Hughes,et al. A new universal random coding bound for the multiple-access channel , 1996, IEEE Trans. Inf. Theory.
[83] Thomas H. E. Ericson. A min-max theorem for antijamming group codes , 1984, IEEE Trans. Inf. Theory.
[84] Ephraim Zehavi,et al. Decoding under integer metrics constraints , 1995, IEEE Trans. Commun..
[85] Nelson M. Blachman,et al. The effect of statistically dependent interference upon channel capacity , 1962, IRE Trans. Inf. Theory.
[86] N. Sloane,et al. Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. I , 1993 .
[87] A. Lapidoth. On the role of mismatch in rate distortion theory , 1995, Proceedings of 1995 IEEE International Symposium on Information Theory.
[88] Rudolf Ahlswede,et al. Coloring hypergraphs: A new approach to multi-user source coding, 1 , 1979 .
[89] Claude E. Shannon,et al. Certain Results in Coding Theory for Noisy Channels , 1957, Inf. Control..
[90] R. Gallager. Energy limited channels: Coding, multiaccess, and spread spectrum , 1987 .
[91] Volodia Blinovsky,et al. Estimation of the size of the list when decoding over an arbitrarily varying channel , 1993, Algebraic Coding.
[92] E. O. Elliott. Estimates of error rates for codes on burst-noise channels , 1963 .
[93] Rudolf Ahlswede,et al. Channel capacities for list codes , 1973, Journal of Applied Probability.
[94] Brian L. Hughes. The smallest list for the arbitrarily varying channel , 1997, IEEE Trans. Inf. Theory.
[95] N. Blachman,et al. On the capacity of a band-limited channel perturbed by statistically dependent interference , 1962, IRE Trans. Inf. Theory.
[97] Jacob Wolfowitz,et al. Multiple Access Channels , 1978 .
[98] Hong Shen Wang,et al. Finite-state Markov channel-a useful model for radio communication channels , 1995 .
[99] Imre Csiszár,et al. Capacity and decoding rules for classes of arbitrarily varying channels , 1989, IEEE Trans. Inf. Theory.
[100] Imre Csiszár,et al. Arbitrarily varying channels with constrained inputs and states , 1988, IEEE Trans. Inf. Theory.
[101] Wayne E. Stark,et al. On the capacity of channels with unknown interference , 1989, IEEE Trans. Inf. Theory.
[102] Rudolf Ahlswede,et al. Arbitrarily Varying Multiple-Access Channels - Part II - Correlated Senders' Side Information, Correlated Messages, and Ambiguous Transmission , 1997, IEEE Trans. Inf. Theory.
[103] Jacob Ziv,et al. Universal decoding for finite-state channels , 1985, IEEE Trans. Inf. Theory.
[104] John A. Gubner. On the capacity region of the discrete additive multiple-access arbitrarily varying channel , 1992, IEEE Trans. Inf. Theory.
[105] Joseph Y. N. Hui,et al. Fundamental issues of multiple accessing , 1983 .
[106] Max H. M. Costa,et al. Writing on dirty paper , 1983, IEEE Trans. Inf. Theory.
[107] Amos Lapidoth,et al. Mismatched decoding and the multiple-access channel , 1994, IEEE Trans. Inf. Theory.
[108] Amos Lapidoth,et al. On the Universality of the LZ-Based Decoding Algorithm , 1998, IEEE Trans. Inf. Theory.
[109] Rudolf Ahlswede,et al. Arbitrarily Varying Multiple-Access Channels Part I - Ericson's Symmetrizability Is Adequate, Gubner's Conjecture Is True , 1997, IEEE Trans. Inf. Theory.
[110] D. Blackwell,et al. The Capacities of Certain Channel Classes Under Random Coding , 1960 .
[111] A. Lapidoth. On the role of mismatch in rate distortion theory , 1997, IEEE Trans. Inf. Theory.
[112] Pravin Varaiya,et al. Capacity, mutual information, and coding for finite-state Markov channels , 1996, IEEE Trans. Inf. Theory.
[113] Rudolf Ahlswede,et al. Correlated sources help transmission over an arbitrarily varying channel , 1997, IEEE Trans. Inf. Theory.
[114] L. J. Forys,et al. The epsilon-Capacity of Classes of Unknown Channels , 1969, Inf. Control..
[115] Richard E. Blahut,et al. Principles and practice of information theory , 1987 .