Michael Gastpar, Chen Feng, Bobak Nazer, Sung Hoon Lim, Adriano Pastore
[1] Sae-Young Chung, et al. Capacity of the Gaussian Two-Way Relay Channel to Within 1/2 Bit, 2009, arXiv.
[2] Peter Harremoës, et al. Rényi Divergence and Kullback-Leibler Divergence, 2012, IEEE Transactions on Information Theory.
[3] Claude E. Shannon, et al. Channels with Side Information at the Transmitter, 1958, IBM Journal of Research and Development.
[4] Uri Erez, et al. The Approximate Sum Capacity of the Symmetric Gaussian K-User Interference Channel, 2014, IEEE Transactions on Information Theory.
[5] Urs Niesen, et al. The Degrees of Freedom of Compute-and-Forward, 2011, IEEE Transactions on Information Theory.
[6] Michael Gastpar, et al. Asymmetric Compute-and-Forward with CSIT, 2014, arXiv.
[7] Ina Fourie, et al. Entropy and Information Theory (2nd ed.), 2012.
[8] Uri Erez, et al. Successive Integer-Forcing and Its Sum-Rate Optimality, 2013, 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[9] Edward C. Posner, et al. Random Coding Strategies for Minimum Entropy, 1975, IEEE Transactions on Information Theory.
[10] A. Rényi. On the Dimension and Entropy of Probability Distributions, 1959.
[11] S. Sandeep Pradhan, et al. Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function, 2007, IEEE Transactions on Information Theory.
[12] Meir Feder, et al. The Random Coding Bound Is Tight for the Average Linear Code or Lattice, 2013, IEEE Transactions on Information Theory.
[13] Uri Erez, et al. Lattice Strategies for the Dirty Multiple Access Channel, 2007, IEEE Transactions on Information Theory.
[14] Abhay Parekh, et al. The Approximate Capacity of the Many-to-One and One-to-Many Gaussian Interference Channels, 2008, IEEE Transactions on Information Theory.
[15] Shashank Vatedka, et al. Secure Compute-and-Forward in a Bidirectional Relay, 2012, IEEE Transactions on Information Theory.
[16] Michael Gastpar, et al. Maximum Throughput Gain of Compute-and-Forward for Multiple Unicast, 2014, IEEE Communications Letters.
[17] Aaron B. Wagner. On Distributed Compression of Linear Functions, 2011, IEEE Transactions on Information Theory.
[18] Giuseppe Caire, et al. Compute-and-Forward Strategies for Cooperative Distributed Antenna Systems, 2012, IEEE Transactions on Information Theory.
[19] Arun Padakandla, et al. An Achievable Rate Region for the Three-User Interference Channel Based on Coset Codes, 2014, IEEE Transactions on Information Theory.
[20] Michael Gastpar, et al. Multiple Access via Compute-and-Forward, 2014, arXiv.
[21] Uri Erez, et al. On the Robustness of Lattice Interference Alignment, 2013, IEEE Transactions on Information Theory.
[22] Alexander Sprintson, et al. Joint Physical Layer Coding and Network Coding for Bidirectional Relaying, 2008, IEEE Transactions on Information Theory.
[23] Yang Yang, et al. Distributed Compression of Linear Functions: Partial Sum-Rate Tightness and Gap to Optimal Sum-Rate, 2014, IEEE Transactions on Information Theory.
[24] Sung Hoon Lim, et al. A Unified Approach to Hybrid Coding, 2015, IEEE Transactions on Information Theory.
[25] Yuval Kochman, et al. Lattice Coding for Signals and Networks, 2014.
[26] Arun Padakandla, et al. Achievable Rate Region for Three-User Discrete Broadcast Channel Based on Coset Codes, 2012, 2013 IEEE International Symposium on Information Theory.
[27] Shigeki Miyake. Coding Theorems for Point-to-Point Communication Systems Using Sparse Matrix Codes, 2010.
[28] Amir K. Khandani, et al. Real Interference Alignment: Exploiting the Potential of Single Antenna Systems, 2009, IEEE Transactions on Information Theory.
[29] Thomas M. Cover, et al. Network Information Theory, 2001.
[30] Claude E. Shannon. A Mathematical Theory of Communication, 1948, Bell System Technical Journal.
[31] Giuseppe Caire, et al. Integer-Forcing Interference Alignment, 2013 IEEE International Symposium on Information Theory.
[32] János Körner, et al. How to Encode the Modulo-Two Sum of Binary Sources (Corresp.), 1979, IEEE Transactions on Information Theory.
[33] Ashok Vardhan Makkuva, et al. On Additive-Combinatorial Affine Inequalities for Shannon Entropy and Differential Entropy, 2016 IEEE International Symposium on Information Theory (ISIT).
[34] Michael Gastpar, et al. Compute-and-Forward: Harnessing Interference Through Structured Codes, 2009, IEEE Transactions on Information Theory.
[35] Arun Padakandla, et al. Computing Sum of Sources over an Arbitrary Multiple Access Channel, 2013 IEEE International Symposium on Information Theory.
[36] David Tse, et al. Interference Neutralization in Distributed Lossy Source Coding, 2010 IEEE International Symposium on Information Theory.
[37] Imre Csiszár, et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd ed., 2011.
[38] Michael Gastpar, et al. Integer-Forcing Linear Receivers, 2010 IEEE International Symposium on Information Theory.
[39] Ram Zamir, et al. On the Loss of Single-Letter Characterization: The Dirty Multiple Access Channel, 2009, IEEE Transactions on Information Theory.
[40] Frank R. Kschischang, et al. An Algebraic Approach to Physical-Layer Network Coding, 2010, IEEE Transactions on Information Theory.
[41] I-Hsiang Wang. Approximate Capacity of the Dirty Multiple-Access Channel with Partial State Information at the Encoders, 2012, IEEE Transactions on Information Theory.
[42] Natasha Devroye, et al. Lattice Codes for the Gaussian Relay Channel: Decode-and-Forward and Compress-and-Forward, 2011, IEEE Transactions on Information Theory.
[43] Sae-Young Chung, et al. Capacity of the Gaussian Two-Way Relay Channel to Within 1/2 Bit, 2009, IEEE Transactions on Information Theory.
[44] Giuseppe Caire, et al. Expanding the Compute-and-Forward Framework: Unequal Powers, Signal Levels, and Multiple Linear Combinations, 2015, IEEE Transactions on Information Theory.
[45] Urs Niesen, et al. Interference Alignment: From Degrees of Freedom to Constant-Gap Capacity Approximations, 2012 IEEE International Symposium on Information Theory.
[46] Robert M. Gray. Entropy and Information Theory, 1990, Springer New York.
[47] Aylin Yener, et al. Providing Secrecy with Structured Codes: Tools and Applications to Two-User Gaussian Channels, 2009, arXiv.
[48] Ilan Shomorony, et al. Degrees of Freedom of Two-Hop Wireless Networks: Everyone Gets the Entire Cake, 2012, IEEE Transactions on Information Theory.
[49] R. Ahlswede. Group Codes Do Not Achieve Shannon's Channel Capacity for General Discrete Channels, 1971.
[50] Robert M. Gray. Entropy and Information Theory, 2nd ed., 2014.
[51] Abbas El Gamal, et al. Capacity Theorems for the Relay Channel, 1979, IEEE Transactions on Information Theory.
[52] Michael Gastpar, et al. Compute-and-Forward Using Nested Linear Codes for the Gaussian MAC, 2015 IEEE Information Theory Workshop (ITW).
[53] Katalin Marton. A Coding Theorem for the Discrete Memoryless Broadcast Channel, 1979, IEEE Transactions on Information Theory.
[54] Aria Ghasemian Sahebi, et al. Distributed Source Coding Using Abelian Group Codes: Extracting Performance from Structure, 2008, 46th Annual Allerton Conference on Communication, Control, and Computing.
[55] Alon Orlitsky, et al. Coding for Computing, 1995, Proceedings of the IEEE 36th Annual Foundations of Computer Science.