Compute-Forward for DMCs: Simultaneous Decoding of Multiple Combinations

Algebraic network information theory is an emerging facet of network information theory that studies the achievable rates of random code ensembles with algebraic structure, such as random linear codes. A distinguishing feature is that linear combinations of codewords can sometimes be decoded more efficiently than the codewords themselves. The present work further develops this framework by studying the simultaneous decoding of multiple messages. Specifically, consider a receiver in a multi-user network that wishes to decode several messages. Simultaneous joint typicality decoding is one of the most powerful techniques for determining the fundamental limits at which reliable decoding is possible. This technique has historically been used in conjunction with random i.i.d. codebooks to establish achievable rate regions for networks. Recently, it has been shown that, in certain scenarios, nested linear codebooks combined with "single-user" or sequential decoding can yield better achievable rates. For instance, the compute-forward problem examines the recovery of $L \le K$ linear combinations of transmitted codewords over a $K$-user multiple-access channel (MAC), and it is well established that linear codebooks can yield higher rates there. This paper develops bounds for simultaneous joint typicality decoding used in conjunction with nested linear codebooks, and applies them to obtain a larger achievable region for compute-forward over a $K$-user discrete memoryless MAC. The key technical challenge is that competing codeword tuples that are linearly dependent on the true codeword tuple introduce statistical dependencies, which necessitates a careful partitioning of the associated error events.
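The structural property that compute-forward exploits can be illustrated with a toy example. The sketch below (a minimal illustration with a hypothetical $3 \times 5$ binary generator matrix, not a construction from the paper) shows that when all users employ the same linear code over GF(2), the mod-2 sum of two codewords is itself a codeword, namely the encoding of the mod-2 sum of the messages. This is why a receiver can decode a linear combination of messages directly, without first decoding each message individually.

```python
import numpy as np

# Hypothetical generator matrix of a (5, 3) binary linear code (for illustration only).
G = np.array([[1, 0, 0, 1, 1],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1]])

def encode(m):
    """Linear encoding over GF(2): c = m G (mod 2)."""
    return (m @ G) % 2

m1 = np.array([1, 0, 1])
m2 = np.array([0, 1, 1])
c1, c2 = encode(m1), encode(m2)

# Linearity: the mod-2 sum of codewords is the codeword of the mod-2 sum
# of the messages -- the combination a receiver could decode directly.
assert np.array_equal((c1 + c2) % 2, encode((m1 + m2) % 2))
```

With random i.i.d. codebooks this closure property fails: the sum of two codewords is, with high probability, not a codeword, so the receiver cannot target the combination directly.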
