A View of Information-Estimation Relations in Gaussian Networks
Shlomo Shamai | H. Vincent Poor | Ronit Bustin | Alex Dytso
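The relation at the center of this survey is the I-MMSE identity of Guo, Shamai and Verdú (entry [72] below): over a scalar Gaussian channel, the derivative of mutual information with respect to SNR equals half the minimum mean-square error. As a minimal sketch (not drawn from the paper itself), the identity can be checked numerically for a standard Gaussian input, where both sides are available in closed form:

```python
import math

def mutual_info(snr):
    # Scalar Gaussian channel Y = sqrt(snr)*X + N with X ~ N(0, 1):
    # I(snr) = (1/2) * ln(1 + snr) nats.
    return 0.5 * math.log1p(snr)

def mmse(snr):
    # MMSE of estimating X from Y for the same Gaussian input:
    # mmse(snr) = 1 / (1 + snr).
    return 1.0 / (1.0 + snr)

# I-MMSE identity: dI/dsnr = (1/2) * mmse(snr).
# Compare a central finite difference of I against half the MMSE.
snr, h = 2.0, 1e-6
numeric_derivative = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
assert abs(numeric_derivative - 0.5 * mmse(snr)) < 1e-8
```

For non-Gaussian inputs no closed form exists in general, but the same identity holds; that is what makes MMSE a useful tool for the network capacity and secrecy problems collected in the references below.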
[1] Shlomo Shamai,et al. On MMSE properties and I-MMSE implications in parallel MIMO Gaussian channels , 2010, 2010 IEEE International Symposium on Information Theory.
[2] Hua Wang,et al. Gaussian Interference Channel Capacity to Within One Bit , 2007, IEEE Transactions on Information Theory.
[3] Aydin Sezgin,et al. Expanded GDoF-optimality Regime of Treating Interference as Noise in the $M\times 2$ X-Channel , 2017, IEEE Transactions on Information Theory.
[4] Shlomo Shamai,et al. Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error , 2010, IEEE Transactions on Information Theory.
[5] Sergio Verdú,et al. Derivative of Mutual Information at Zero SNR: The Gaussian-Noise Case , 2011, IEEE Transactions on Information Theory.
[6] A. Sridharan. Broadcast Channels , 2022 .
[7] Shlomo Shamai,et al. On the SNR-Evolution of the MMSE Function of Codes for the Gaussian Broadcast and Wiretap Channels , 2016, IEEE Transactions on Information Theory.
[8] Thomas M. Cover,et al. Elements of Information Theory , 2005 .
[9] Hiroshi Sato,et al. The capacity of the Gaussian interference channel under strong interference , 1981, IEEE Trans. Inf. Theory.
[10] Daniela Tuninetti,et al. On discrete alphabets for the two-user Gaussian interference channel with one receiver lacking knowledge of the interfering codebook , 2014, 2014 Information Theory and Applications Workshop (ITA).
[11] Ayfer Özgür,et al. Capacity of the Energy-Harvesting Channel With a Finite Battery , 2016, IEEE Transactions on Information Theory.
[12] Tyrone E. Duncan. Mutual Information for Stochastic Signals and Fractional Brownian Motion , 2008, IEEE Transactions on Information Theory.
[13] Tsachy Weissman,et al. Pointwise Relations Between Information and Estimation in Gaussian Noise , 2012, IEEE Transactions on Information Theory.
[14] Shlomo Shamai,et al. On MMSE properties of optimal codes for the Gaussian wiretap channel , 2015, 2015 IEEE Information Theory Workshop (ITW).
[15] S. Shamai,et al. On Extrinsic Information of Good Codes Operating Over Memoryless Channels with Incremental Noisiness , 2006, 2006 IEEE 24th Convention of Electrical & Electronics Engineers in Israel.
[16] Shlomo Shamai,et al. On MMSE Crossing Properties and Implications in Parallel Vector Gaussian Channels , 2013, IEEE Transactions on Information Theory.
[17] Shlomo Shamai,et al. Degraded broadcast channel: Secrecy outside of a bounded range , 2015, 2015 IEEE Information Theory Workshop (ITW).
[18] Imre Csiszár,et al. Information Theory - Coding Theorems for Discrete Memoryless Systems, Second Edition , 2011 .
[19] Thomas M. Cover,et al. Comments on Broadcast Channels , 1998, IEEE Trans. Inf. Theory.
[20] Aaron D. Wyner,et al. On the capacity of the Gaussian channel with a finite number of input levels , 1990, IEEE Trans. Inf. Theory.
[21] Shlomo Shamai,et al. The Interplay Between Information and Estimation Measures , 2013, Found. Trends Signal Process..
[22] Kenneth Rose,et al. On Conditions for Linearity of Optimal Estimation , 2010, IEEE Transactions on Information Theory.
[23] H. Vincent Poor,et al. Secrecy Capacity Region of a Multiple-Antenna Gaussian Broadcast Channel With Confidential Messages , 2007, IEEE Transactions on Information Theory.
[24] Shlomo Shamai,et al. On communications through a Gaussian noise channel with an MMSE disturbance constraint , 2016, 2016 Information Theory and Applications Workshop (ITA).
[25] Gerhard Kramer,et al. A New Outer Bound and the Noisy-Interference Sum–Rate Capacity for Gaussian Interference Channels , 2007, IEEE Transactions on Information Theory.
[26] Max H. M. Costa,et al. The capacity region of a class of deterministic interference channels , 1982, IEEE Trans. Inf. Theory.
[27] Tsachy Weissman,et al. Relations Between Information and Estimation in Discrete-Time Lévy Channels , 2014, IEEE Transactions on Information Theory.
[28] Aydano B. Carleial,et al. A case where interference does not reduce capacity (Corresp.) , 1975, IEEE Trans. Inf. Theory.
[29] Te Sun Han,et al. A new achievable rate region for the interference channel , 1981, IEEE Trans. Inf. Theory.
[30] A. J. Stam. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon , 1959, Inf. Control..
[31] Max H. M. Costa,et al. The capacity region of the discrete memoryless interference channel with strong interference , 1987, IEEE Trans. Inf. Theory.
[32] Shlomo Shamai,et al. On MMSE properties of “good” and “bad” codes for the Gaussian broadcast channel , 2015, 2015 IEEE International Symposium on Information Theory (ISIT).
[33] A. D. Wyner,et al. The wire-tap channel , 1975, The Bell System Technical Journal.
[34] Loren W. Nolte,et al. Some geometric properties of the likelihood ratio (Corresp.) , 1971, IEEE Trans. Inf. Theory.
[35] Gal Chechik,et al. Information Bottleneck for Gaussian Variables , 2003, J. Mach. Learn. Res..
[36] Neri Merhav,et al. Analysis of Mismatched Estimation Errors Using Gradients of Partition Functions , 2013, IEEE Transactions on Information Theory.
[37] Shlomo Shamai,et al. Proof of Entropy Power Inequalities Via MMSE , 2006, 2006 IEEE International Symposium on Information Theory.
[38] Syed Ali Jafar,et al. On the Capacity and Generalized Degrees of Freedom of the X Channel , 2008, ArXiv.
[39] Martin E. Hellman,et al. The Gaussian wire-tap channel , 1978, IEEE Trans. Inf. Theory.
[40] Kamyar Moshksar,et al. An alternative to decoding interference or treating interference as Gaussian noise , 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[41] Mikael Skoglund,et al. Communication and interference coordination , 2014, 2014 Information Theory and Applications Workshop (ITA).
[42] Shlomo Shamai,et al. Comment on the Equality Condition for the I-MMSE Proof of Entropy Power Inequality , 2017, ArXiv.
[43] Shlomo Shamai,et al. An MMSE Approach to the Secrecy Capacity of the MIMO Gaussian Wiretap Channel , 2009, 2009 IEEE International Symposium on Information Theory.
[44] Shlomo Shamai,et al. The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel , 2006, IEEE Transactions on Information Theory.
[45] T. Kailath. The innovations approach to detection and estimation theory , 1970 .
[46] Tsachy Weissman,et al. Mutual information, relative entropy, and estimation in the Poisson channel , 2010, 2011 IEEE International Symposium on Information Theory Proceedings.
[47] Shlomo Shamai,et al. The effect of maximal rate codes on the interfering message rate , 2014, 2014 IEEE International Symposium on Information Theory.
[48] G. David Forney,et al. On the role of MMSE estimation in approaching the information-theoretic limits of linear Gaussian channels: Shannon meets Wiener , 2004, ArXiv.
[49] Tyrone E. Duncan,et al. Evaluation of Likelihood Functions , 1968, Inf. Control..
[50] Sergio Verdú,et al. Mismatched Estimation and Relative Entropy , 2009, IEEE Transactions on Information Theory.
[51] Gottfried Ungerboeck,et al. Channel coding with multilevel/phase signals , 1982, IEEE Trans. Inf. Theory.
[52] Patrick P. Bergmans,et al. A simple converse for broadcast channels with additive white Gaussian noise (Corresp.) , 1974, IEEE Trans. Inf. Theory.
[53] Daniela Tuninetti,et al. Inner and Outer Bounds for the Gaussian Cognitive Interference Channel and New Capacity Results , 2010, IEEE Transactions on Information Theory.
[54] Shlomo Shamai,et al. A Note on the Secrecy Capacity of the Multiple-Antenna Wiretap Channel , 2007, IEEE Transactions on Information Theory.
[55] Amir K. Khandani,et al. Capacity bounds for the Gaussian Interference Channel , 2008, 2008 IEEE International Symposium on Information Theory.
[56] Lizhong Zheng,et al. A Coordinate System for Gaussian Networks , 2010, IEEE Transactions on Information Theory.
[57] Daniela Tuninetti,et al. Interference as Noise: Friend or Foe? , 2015, IEEE Transactions on Information Theory.
[58] Tyrone E. Duncan. Mutual Information for Stochastic Signals and Lévy Processes , 2010, IEEE Transactions on Information Theory.
[59] Sergio Verdú,et al. MMSE Dimension , 2011, IEEE Trans. Inf. Theory.
[60] Syed Ali Jafar,et al. Interference Alignment and Degrees of Freedom of the $K$-User Interference Channel , 2008, IEEE Transactions on Information Theory.
[61] Shlomo Shamai,et al. A Vector Generalization of Costa's Entropy-Power Inequality With Applications , 2009, IEEE Transactions on Information Theory.
[62] Dongning Guo,et al. On information-Estimation relationships over binomial and negative binomial models , 2013, 2013 IEEE International Symposium on Information Theory.
[63] Matthieu R. Bloch,et al. Physical-Layer Security: From Information Theory to Security Engineering , 2011 .
[64] R. Esposito,et al. On a Relation between Detection and Estimation in Decision Theory , 1968, Inf. Control..
[65] Tie Liu,et al. An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems , 2006, IEEE Transactions on Information Theory.
[66] Shlomo Shamai,et al. MMSE of “Bad” Codes , 2013, IEEE Transactions on Information Theory.
[67] Shlomo Shamai,et al. On the equality condition for the I-MMSE proof of the entropy power inequality , 2017, 2017 55th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[68] Shlomo Shamai,et al. Upper and Lower Bounds on the Capacity of Amplitude-Constrained MIMO Channels , 2017, GLOBECOM 2017 - 2017 IEEE Global Communications Conference.
[69] Robert G. Gallager,et al. Capacity and coding for degraded broadcast channels , 1974 .
[70] Syed A. Jafar,et al. Interference Alignment: A New Look at Signal Dimensions in a Communication Network , 2011, Found. Trends Commun. Inf. Theory.
[71] Venugopal V. Veeravalli,et al. Gaussian Interference Networks: Sum Capacity in the Low-Interference Regime and New Outer Bounds on the Capacity Region , 2008, IEEE Transactions on Information Theory.
[72] Shlomo Shamai,et al. Mutual information and minimum mean-square error in Gaussian channels , 2004, IEEE Transactions on Information Theory.
[73] Sergio Verdú,et al. Functional Properties of Minimum Mean-Square Error and Mutual Information , 2012, IEEE Transactions on Information Theory.
[74] Sergio Verdú,et al. On limiting characterizations of memoryless multiuser capacity regions , 1993, IEEE Trans. Inf. Theory.
[75] Antonia Maria Tulino,et al. Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof , 2006, IEEE Transactions on Information Theory.
[76] Sergio Verdú,et al. Optimal Phase Transitions in Compressed Sensing , 2011, IEEE Transactions on Information Theory.
[77] Claude E. Shannon. A Mathematical Theory of Communication , 1948, The Bell System Technical Journal.
[78] Shlomo Shamai,et al. Information Dimension and the Degrees of Freedom of the Interference Channel , 2015, IEEE Transactions on Information Theory.
[79] Shlomo Shamai,et al. An I-MMSE based graphical representation of rate and equivocation for the Gaussian broadcast channel , 2015, 2015 IEEE Conference on Communications and Network Security (CNS).
[80] Matthieu R. Bloch,et al. Wireless Information-Theoretic Security , 2008, IEEE Transactions on Information Theory.
[81] Seymour Sherman,et al. Non-mean-square error criteria , 1958, IRE Trans. Inf. Theory.
[82] Moshe Zakai,et al. On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel , 2004, IEEE Transactions on Information Theory.
[83] Dongning Guo,et al. Relative entropy and score function: New information-estimation relationships through arbitrary additive perturbation , 2009, 2009 IEEE International Symposium on Information Theory.
[84] Shlomo Shamai,et al. A generalized Ozarow-Wyner capacity bound with applications , 2017, 2017 IEEE International Symposium on Information Theory (ISIT).
[85] Tobias Koch,et al. High-SNR Asymptotics of Mutual Information for Discrete Constellations With Applications to BICM , 2014, IEEE Transactions on Information Theory.
[86] Shlomo Shamai,et al. Statistical Physics of Signal Estimation in Gaussian Noise: Theory and Examples of Phase Transitions , 2008, IEEE Transactions on Information Theory.
[87] Chandra Nair,et al. Sub-optimality of Han-Kobayashi achievable region for interference channels , 2015, 2015 IEEE International Symposium on Information Theory (ISIT).
[88] Shlomo Shamai,et al. Additive non-Gaussian noise channels: mutual information and conditional mean estimation , 2005, Proceedings. International Symposium on Information Theory, 2005. ISIT 2005..
[89] Jacob Ziv,et al. Mutual information of the white Gaussian channel with and without feedback , 1971, IEEE Trans. Inf. Theory.
[90] Shlomo Shamai,et al. On additive channels with generalized Gaussian noise , 2017, 2017 IEEE International Symposium on Information Theory (ISIT).
[91] Imre Csiszár,et al. Broadcast channels with confidential messages , 1978, IEEE Trans. Inf. Theory.
[92] Daniela Tuninetti,et al. On the Two-User Interference Channel With Lack of Knowledge of the Interference Codebook at One Receiver , 2015, IEEE Transactions on Information Theory.
[93] Shlomo Shamai,et al. On the applications of the minimum mean p-th error (MMPE) to information theoretic quantities , 2016, 2016 IEEE Information Theory Workshop (ITW).
[94] It's Easier to Approximate , 2010 .
[95] Abbas El Gamal,et al. Communication With Disturbance Constraints , 2014, IEEE Transactions on Information Theory.
[96] Daniel Pérez Palomar,et al. Gradient of mutual information in linear vector Gaussian channels , 2006, IEEE Transactions on Information Theory.
[97] S. Verdú,et al. The impact of constellation cardinality on Gaussian channel capacity , 2010, 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[98] Haim H. Permuter,et al. Directed Information, Causal Estimation, and Communication in Continuous Time , 2009, IEEE Transactions on Information Theory.
[99] Ayfer Özgür,et al. Near Optimal Energy Control and Approximate Capacity of Energy Harvesting Communication , 2014, IEEE Journal on Selected Areas in Communications.
[100] Sergio Verdú,et al. A simple proof of the entropy-power inequality , 2006, IEEE Transactions on Information Theory.
[101] Thomas M. Cover,et al. Network Information Theory , 2001 .
[102] Emre Telatar,et al. Bounds on the capacity region of a class of interference channels , 2007, 2007 IEEE International Symposium on Information Theory.
[103] T. Duncan. On the Calculation of Mutual Information , 1970 .
[104] Miguel R. D. Rodrigues,et al. MIMO Gaussian Channels With Arbitrary Inputs: Optimal Precoding and Power Allocation , 2010, IEEE Transactions on Information Theory.
[105] Jian Song,et al. Extensions of the I-MMSE Relationship to Gaussian Channels With Feedback and Memory , 2014, IEEE Transactions on Information Theory.
[106] Rudolf Ahlswede,et al. Multi-way communication channels , 1973 .
[107] A. Robert Calderbank,et al. Soft-Decoding-Based Strategies for Relay and Interference Channels: Analysis and Achievable Rates Using LDPC Codes , 2010, IEEE Transactions on Information Theory.
[108] Shlomo Shamai,et al. Mutual Information and Conditional Mean Estimation in Poisson Channels , 2004, IEEE Transactions on Information Theory.
[109] David Tse,et al. The two-user Gaussian interference channel: a deterministic view , 2008, Eur. Trans. Telecommun..