A Joint Typicality Approach to Compute–Forward
[1] Arun Padakandla, et al. Computing sum of sources over an arbitrary multiple access channel, 2013, 2013 IEEE International Symposium on Information Theory.
[2] Natasha Devroye, et al. Lattice Codes for the Gaussian Relay Channel: Decode-and-Forward and Compress-and-Forward, 2011, IEEE Transactions on Information Theory.
[3] Uri Erez, et al. On the Robustness of Lattice Interference Alignment, 2013, IEEE Transactions on Information Theory.
[4] Aaron D. Wyner, et al. Channels with Side Information at the Transmitter, 1993.
[5] A. Rényi. On the dimension and entropy of probability distributions, 1959.
[6] Amir K. Khandani, et al. Real Interference Alignment: Exploiting the Potential of Single Antenna Systems, 2009, IEEE Transactions on Information Theory.
[7] Aylin Yener, et al. Providing Secrecy With Structured Codes: Two-User Gaussian Channels, 2014, IEEE Transactions on Information Theory.
[8] Ilan Shomorony, et al. Degrees of Freedom of Two-Hop Wireless Networks: Everyone Gets the Entire Cake, 2012, IEEE Transactions on Information Theory.
[9] Meir Feder, et al. The Random Coding Bound Is Tight for the Average Linear Code or Lattice, 2013, IEEE Transactions on Information Theory.
[10] Urs Niesen, et al. Interference alignment: From degrees-of-freedom to constant-gap capacity approximations, 2012, ISIT.
[11] Ina Fourie, et al. Entropy and Information Theory (2nd ed.), 2012.
[12] Giuseppe Caire, et al. Compute-and-Forward Strategies for Cooperative Distributed Antenna Systems, 2012, IEEE Transactions on Information Theory.
[13] Urs Niesen, et al. The degrees of freedom of compute-and-forward, 2011, ISIT.
[14] Katalin Marton, et al. A coding theorem for the discrete memoryless broadcast channel, 1979, IEEE Transactions on Information Theory.
[15] Shigeki Miyake. Coding theorems for point-to-point communication systems using sparse matrix codes, 2010.
[16] Ram Zamir, et al. On the Loss of Single-Letter Characterization: The Dirty Multiple Access Channel, 2009, IEEE Transactions on Information Theory.
[17] Arun Padakandla, et al. Achievable rate region for three user discrete broadcast channel based on coset codes, 2012, 2013 IEEE International Symposium on Information Theory.
[18] Aaron B. Wagner. On Distributed Compression of Linear Functions, 2011, IEEE Transactions on Information Theory.
[19] Wenbo He, et al. Integer-forcing interference alignment: Iterative optimization via aligned lattice reduction, 2015, 2015 IEEE International Symposium on Information Theory (ISIT).
[20] Alon Orlitsky, et al. Coding for computing, 1995, Proceedings of IEEE 36th Annual Foundations of Computer Science.
[21] Sung Hoon Lim, et al. A Unified Approach to Hybrid Coding, 2015, IEEE Transactions on Information Theory.
[22] Robert M. Gray. Entropy and Information Theory (2nd ed.), 2014.
[23] Uri Erez, et al. Lattice Strategies for the Dirty Multiple Access Channel, 2007, IEEE Transactions on Information Theory.
[24] Hesham El Gamal, et al. On the Optimality of Lattice Coding and Decoding in Multiple Access Channels, 2007, 2007 IEEE International Symposium on Information Theory.
[25] Michael Gastpar, et al. Asymmetric Compute-and-Forward with CSIT, 2014, arXiv.
[26] Peter Harremoës, et al. Rényi Divergence and Kullback-Leibler Divergence, 2012, IEEE Transactions on Information Theory.
[27] Edward C. Posner, et al. Random coding strategies for minimum entropy, 1975, IEEE Transactions on Information Theory.
[28] S. Sandeep Pradhan, et al. Lattices for Distributed Source Coding: Jointly Gaussian Sources and Reconstruction of a Linear Function, 2007, AAECC.
[29] Michael Gastpar, et al. Gaussian Multiple Access via Compute-and-Forward, 2014, IEEE Transactions on Information Theory.
[30] Michael Gastpar, et al. Maximum Throughput Gain of Compute-and-Forward for Multiple Unicast, 2014, IEEE Communications Letters.
[31] Abbas El Gamal, et al. Optimal Achievable Rates for Interference Networks With Random Codes, 2012, IEEE Transactions on Information Theory.
[32] Michael Gastpar, et al. Compute-and-forward using nested linear codes for the Gaussian MAC, 2015, 2015 IEEE Information Theory Workshop (ITW).
[33] I-Hsiang Wang. Approximate Capacity of the Dirty Multiple-Access Channel With Partial State Information at the Encoders, 2012, IEEE Transactions on Information Theory.
[34] Alexander Sprintson, et al. Joint Physical Layer Coding and Network Coding for Bidirectional Relaying, 2008, IEEE Transactions on Information Theory.
[35] Sae-Young Chung, et al. Capacity of the Gaussian Two-Way Relay Channel to Within 1/2 Bit, 2009, arXiv.
[36] S. Sandeep Pradhan, et al. Distributed Source Coding Using Abelian Group Codes: A New Achievable Rate-Distortion Region, 2011, IEEE Transactions on Information Theory.
[37] Andrew Thangaraj, et al. Secure Compute-and-Forward in , 2015.
[38] János Körner, et al. How to encode the modulo-two sum of binary sources (Corresp.), 1979, IEEE Transactions on Information Theory.
[39] Imre Csiszár, et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.
[40] R. Ahlswede. Group Codes do not Achieve Shannon's Channel Capacity for General Discrete Channels, 1971.
[41] P. Vijay Kumar, et al. Linear Coding Schemes for the Distributed Computation of Subspaces, 2013, IEEE Journal on Selected Areas in Communications.
[42] Yihong Wu, et al. Equivalence of Additive-Combinatorial Linear Inequalities for Shannon Entropy and Differential Entropy, 2016, IEEE Transactions on Information Theory.
[43] Zixiang Xiong, et al. Distributed compression of linear functions: Partial sum-rate tightness and gap to optimal sum-rate, 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[44] Giuseppe Caire, et al. Expanding the Compute-and-Forward Framework: Unequal Powers, Signal Levels, and Multiple Linear Combinations, 2015, IEEE Transactions on Information Theory.
[45] Arun Padakandla, et al. An Achievable Rate Region for the Three-User Interference Channel Based on Coset Codes, 2014, IEEE Transactions on Information Theory.
[46] Michael Gastpar, et al. Integer-forcing linear receivers, 2010, 2010 IEEE International Symposium on Information Theory.
[47] Uri Erez, et al. The Approximate Sum Capacity of the Symmetric Gaussian K-User Interference Channel, 2012, IEEE Transactions on Information Theory.
[48] Uri Erez, et al. Successive integer-forcing and its sum-rate optimality, 2013, 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[49] Michael Gastpar, et al. Compute-and-Forward: Harnessing Interference Through Structured Codes, 2009, IEEE Transactions on Information Theory.
[50] David Tse, et al. Interference neutralization in distributed lossy source coding, 2010, 2010 IEEE International Symposium on Information Theory.
[51] Sae-Young Chung, et al. Capacity of the Gaussian Two-Way Relay Channel to Within 1/2 Bit, 2009, IEEE Transactions on Information Theory.
[52] Abhay Parekh, et al. The Approximate Capacity of the Many-to-One and One-to-Many Gaussian Interference Channels, 2008, IEEE Transactions on Information Theory.
[53] Uri Erez, et al. The Approximate Sum Capacity of the Symmetric Gaussian K-User Interference Channel, 2014, IEEE Transactions on Information Theory.
[54] Frank R. Kschischang, et al. An Algebraic Approach to Physical-Layer Network Coding, 2010, IEEE Transactions on Information Theory.