Capacity Bounds for Diamond Networks With an Orthogonal Broadcast Channel

A class of diamond networks is studied in which the broadcast component is orthogonal and modeled by two independent bit-pipes. New upper and lower bounds on the capacity are derived. The proof technique for the upper bound generalizes the bounding techniques of Ozarow for the Gaussian multiple description problem (1980) and of Kang and Liu for the Gaussian diamond network (2011). The lower bound is based on Marton's coding technique and superposition coding. The bounds are evaluated for Gaussian and binary adder multiple access channels (MACs). For Gaussian MACs, both the lower and upper bounds strengthen the Kang-Liu bounds and establish the capacity for interesting ranges of bit-pipe capacities. For binary adder MACs, the capacity is established for all ranges of bit-pipe capacities.
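To make the bit-pipe/MAC setup concrete, the sketch below evaluates a simple relaxation of the standard cut-set upper bound for a two-relay Gaussian diamond network with bit-pipes of capacities C1, C2 to the relays. This is only an illustrative baseline, not the paper's bounding technique: each cut is maximized separately over the relay input correlation (independent inputs for the cross cuts, fully coherent inputs for the MAC cut), so the minimum is a valid, if loose, upper bound. The function name and the unit-noise-variance normalization are assumptions made for this sketch.

```python
import math


def cutset_bound(c1, c2, p1, p2):
    """Relaxed cut-set upper bound (bits/channel use) for a two-relay
    Gaussian diamond network: bit-pipes of capacities c1, c2 feed the
    relays, which transmit over a Gaussian MAC with powers p1, p2 and
    unit noise variance. Each cut is maximized separately, so the
    minimum over cuts upper-bounds the true cut-set bound."""
    g = lambda snr: 0.5 * math.log2(1.0 + snr)  # Gaussian capacity function

    return min(
        c1 + c2,                                  # cut isolating the source
        c1 + g(p2),                               # pipe 1 plus relay-2 MAC link
        c2 + g(p1),                               # pipe 2 plus relay-1 MAC link
        g(p1 + p2 + 2.0 * math.sqrt(p1 * p2)),    # MAC cut, coherent combining
    )


# When the bit-pipes are the bottleneck, the bound is c1 + c2:
print(cutset_bound(0.5, 0.5, 100.0, 100.0))   # 1.0
# When the MAC is the bottleneck, the coherent sum-power term dominates:
print(cutset_bound(10.0, 10.0, 1.0, 1.0))     # 0.5 * log2(5) ~ 1.161
```

The two regimes printed above mirror the abstract's point that the interesting behavior, and the gap the new bounds close, lies between these extremes, where neither the bit-pipes nor the MAC alone determines capacity.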

[1]  Emre Telatar,et al.  A new entropy power inequality for integer-valued random variables , 2013, 2013 IEEE International Symposium on Information Theory.

[2]  Shlomo Shamai,et al.  A binary analog to the entropy-power inequality , 1990, IEEE Trans. Inf. Theory.

[3]  Shlomo Shamai,et al.  Extension of an entropy property for binary input memoryless symmetric channels , 1989, IEEE Trans. Inf. Theory.

[4]  Nan Liu,et al.  The Gaussian multiple access diamond channel , 2011, 2011 IEEE International Symposium on Information Theory Proceedings.

[5]  Hans S. Witsenhausen,et al.  Entropy inequalities for discrete channels , 1974, IEEE Trans. Inf. Theory.

[6]  Shlomo Shamai,et al.  Communication Via Decentralized Processing , 2005, IEEE Transactions on Information Theory.

[7]  Thomas M. Cover,et al.  Network Information Theory , 2001 .

[8]  Suhas N. Diggavi,et al.  Wireless Network Information Flow: A Deterministic Approach , 2009, IEEE Transactions on Information Theory.

[9]  Venkat Anantharam,et al.  An improved outer bound for multiterminal source coding , 2008, IEEE Transactions on Information Theory.

[10]  Gerhard Kramer,et al.  Capacity of two-relay diamond networks with rate-limited links to the relays and a binary adder multiple access channel , 2016, 2016 IEEE International Symposium on Information Theory (ISIT).

[11]  Frans M. J. Willems,et al.  The discrete memoryless multiple-access channel with cribbing encoders , 1985, IEEE Trans. Inf. Theory.

[12]  D. Traskov,et al.  Reliable Communication in Networks with Multi-access Interference , 2007, 2007 IEEE Information Theory Workshop.

[13]  Anthony G. Oettinger,et al.  IEEE Transactions on Information Theory , 1998 .

[14]  Naresh Sharma,et al.  Entropy power inequality for a family of discrete random variables , 2011, 2011 IEEE International Symposium on Information Theory Proceedings.

[15]  Abbas El Gamal,et al.  Network Information Theory , 2011, Cambridge University Press.

[16]  Christophe Vignat,et al.  An entropy power inequality for the binomial family , 2003 .

[17]  Andries P. Hekstra,et al.  Dependence balance bounds for single-output two-way channels , 1989, IEEE Trans. Inf. Theory.

[18]  Claude E. Shannon  A Mathematical Theory of Communication , 1948, The Bell System Technical Journal.

[19]  Varun Jog,et al.  The Entropy Power Inequality and Mrs. Gerber's Lemma for Groups of Order 2^n , 2014, IEEE Trans. Inf. Theory.

[20]  Varun Jog,et al.  The Entropy Power Inequality and Mrs. Gerber's Lemma for groups of order 2^n , 2013, 2013 IEEE International Symposium on Information Theory.

[21]  Christina Fragouli,et al.  Optimizing Quantize-Map-and-Forward relaying for Gaussian diamond networks , 2012, 2012 IEEE Information Theory Workshop.

[22]  Jun Chen,et al.  A Lower Bound on the Sum Rate of Multiple Description Coding With Symmetric Distortion Constraints , 2014, IEEE Transactions on Information Theory.

[23]  Charles R. Johnson,et al.  Matrix analysis , 1985, Cambridge University Press.

[24]  Aaron D. Wyner,et al.  A theorem on the entropy of certain binary sequences and applications-I , 1973, IEEE Trans. Inf. Theory.

[25]  Thomas M. Cover,et al.  Elements of Information Theory , 2005 .

[26]  Brett Schein,et al.  Distributed coordination in network information theory , 2001 .

[27]  Suhas N. Diggavi,et al.  The Approximate Capacity of the Gaussian n-Relay Diamond Network , 2013, IEEE Transactions on Information Theory.

[28]  Gerhard Kramer,et al.  Capacity bounds for a class of diamond networks , 2014, 2014 IEEE International Symposium on Information Theory.

[29]  Rudolf Ahlswede,et al.  On the connection between the entropies of input and output distributions of discrete memoryless channels , 1977 .

[30]  Wei Kang,et al.  Capacity of a Class of Diamond Channels , 2008, IEEE Transactions on Information Theory.

[31]  Joy A. Thomas,et al.  Feedback can at most double Gaussian multiple access channel capacity , 1987, IEEE Trans. Inf. Theory.

[32]  L. Ozarow,et al.  On a source-coding problem with two channels and three receivers , 1980, The Bell System Technical Journal.

[33]  Ayfer Özgür,et al.  Achieving the capacity of the N-relay Gaussian diamond network within log n bits , 2012, 2012 IEEE Information Theory Workshop.

[34]  Tie Liu,et al.  An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems , 2006, IEEE Transactions on Information Theory.

[35]  Fan Cheng,et al.  Generalization of Mrs. Gerber's Lemma , 2014, Commun. Inf. Syst..

[36]  Oliver Johnson,et al.  Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality , 2009, IEEE Transactions on Information Theory.

[37]  Aaron D. Wyner,et al.  A theorem on the entropy of certain binary sequences and applications-II , 1973, IEEE Trans. Inf. Theory.

[39]  Sennur Ulukus,et al.  Dependence Balance Based Outer Bounds for Gaussian Networks With Cooperation and Feedback , 2011, IEEE Transactions on Information Theory.

[40]  L. Goddard Information Theory , 1962, Nature.