Strong Data Processing Inequalities for Input Constrained Additive Noise Channels
[1] William Feller et al. An Introduction to Probability Theory and Its Applications, 1950.
[2] R. Dobrushin. Central Limit Theorem for Nonstationary Markov Chains. II, 1956.
[3] P. A. P. Moran et al. An Introduction to Probability Theory, 1968.
[4] R. Gallager. Information Theory and Reliable Communication, 1968.
[5] D. A. Bell et al. Information Theory and Reliable Communication, 1969.
[6] Aaron D. Wyner et al. A theorem on the entropy of certain binary sequences and applications-I, 1973, IEEE Trans. Inf. Theory.
[7] Aaron D. Wyner et al. A theorem on the entropy of certain binary sequences and applications-II, 1973, IEEE Trans. Inf. Theory.
[8] Hans S. Witsenhausen et al. Entropy inequalities for discrete channels, 1974, IEEE Trans. Inf. Theory.
[9] J. Kemperman. On the Shannon capacity of an arbitrary channel, 1974.
[10] Hans S. Witsenhausen et al. A conditional entropy bound for a pair of discrete random variables, 1975, IEEE Trans. Inf. Theory.
[11] P. Gács et al. Spreading of Sets in Product Spaces and Hypercontraction of the Markov Operator, 1976.
[12] Thomas M. Cover et al. Elements of Information Theory, 2005.
[13] Joel E. Cohen et al. Relative entropy under mappings by stochastic matrices, 1993.
[14] V. V. Petrov. Limit Theorems of Probability Theory: Sequences of Independent Random Variables, 1995.
[15] M. Talagrand. Transportation cost for Gaussian and other product measures, 1996.
[16] O. Kallenberg. Foundations of Modern Probability, 2021, Probability Theory and Stochastic Modelling.
[17] Charles M. Grinstead et al. Introduction to Probability, 1999.
[18] Thomas M. Cover et al. Network Information Theory, 2001.
[19] Shlomo Shamai et al. Mutual information and minimum mean-square error in Gaussian channels, 2004, IEEE Transactions on Information Theory.
[20] Thomas M. Cover et al. Elements of Information Theory (2nd ed.), 2006.
[21] L. Dworsky. An Introduction to Probability, 2008.
[22] Tsachy Weissman et al. The Information Lost in Erasures, 2008, IEEE Transactions on Information Theory.
[23] S. Verdú et al. The impact of constellation cardinality on Gaussian channel capacity, 2010, 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[24] Imre Csiszár et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.
[25] Igor Vajda et al. On Pairs of $f$-Divergences and Their Joint Range, 2010, IEEE Transactions on Information Theory.
[26] Shlomo Shamai et al. Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error, 2010, IEEE Transactions on Information Theory.
[27] Venkat Anantharam et al. On Maximal Correlation, Hypercontractivity, and the Data Processing Inequality studied by Erkip and Cover, 2013, arXiv.
[28] Igal Sason et al. Concentration of Measure Inequalities in Information Theory, Communications, and Coding, 2012, Found. Trends Commun. Inf. Theory.
[29] Maxim Raginsky et al. Strong Data Processing Inequalities and $\Phi$-Sobolev Inequalities for Discrete Channels, 2014, IEEE Transactions on Information Theory.
[30] Yihong Wu et al. Dissipation of Information in Channels With Input Constraints, 2014, IEEE Transactions on Information Theory.