Quasi Structured Codes for Multi-Terminal Communications

A new class of structured codes called quasi group codes (QGCs) is introduced. A QGC is a subset of a group code. In contrast with group codes, QGCs are not closed under group addition. The parameters of a QGC can be chosen such that the size of $\mathcal{C}+\mathcal{C}$ equals any number between $|\mathcal{C}|$ and $|\mathcal{C}|^{2}$. We analyze the performance of a specific class of QGCs constructed by assigning single-letter distributions to the indices of the codewords in a group code; the QGC is then defined as the set of codewords whose index lies in the typical set corresponding to these single-letter distributions. The asymptotic performance limits of this class of QGCs are characterized using single-letter information quantities, and corresponding covering and packing bounds are derived. It is shown that the point-to-point channel capacity and the optimal rate-distortion function are achievable using QGCs. Coding strategies based on QGCs are introduced for three fundamental multi-terminal problems: the Körner-Marton problem for modulo prime-power sums, computation over the multiple access channel (MAC), and the MAC with distributed states. For each problem, a single-letter achievable rate region is derived. It is shown, through examples, that these coding strategies improve upon previous strategies based on unstructured codes, linear codes, and group codes.

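As a rough illustration of the construction described in the abstract, the following minimal Python sketch builds a small group code over $\mathbb{Z}_4$ and retains only the codewords whose index vector is typical with respect to a chosen single-letter distribution. All concrete values here (the group $\mathbb{Z}_4$, the sizes $k$ and $n$, the generator matrix, the index distribution, and the typicality slack eps) are illustrative assumptions, not parameters from the paper.

```python
import itertools
import numpy as np

# Illustrative parameters (assumptions, not taken from the paper):
# group Z_q with q a prime power, k index symbols, blocklength n.
q, k, n = 4, 4, 6
rng = np.random.default_rng(0)
G = rng.integers(0, q, size=(k, n))       # generator of the ambient group code
pmf = np.array([0.5, 0.25, 0.15, 0.1])    # single-letter distribution on index symbols
eps = 0.15                                # typicality slack

def typical(u, pmf, eps):
    """Index vector u is typical if its empirical frequencies are within eps of pmf."""
    freqs = np.bincount(u, minlength=len(pmf)) / len(u)
    return np.all(np.abs(freqs - pmf) <= eps)

# Ambient group code: all codewords u*G mod q over index vectors u in Z_q^k.
indices = [np.array(u) for u in itertools.product(range(q), repeat=k)]
group_code = {tuple(u @ G % q) for u in indices}

# Quasi group code: keep only codewords whose index vector is typical.
qgc = {tuple(u @ G % q) for u in indices if typical(u, pmf, eps)}

print(len(group_code), len(qgc))          # the QGC is a (typically strict) subset
```

Intuitively, the choice of index distribution controls how much additive structure the subset retains, which is the degree of freedom that lets $|\mathcal{C}+\mathcal{C}|$ range between $|\mathcal{C}|$ and $|\mathcal{C}|^{2}$ as stated above.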