Network Coding and Distributed Compression over Large Networks: Some Basic Principles

The fields of network coding and distributed compression have focused primarily on finding the capacity for families of problems defined by either a broad class of network topologies (e.g., directed acyclic networks) under a narrow class of demands (e.g., multicast), or a specific network topology (e.g., three-node networks) under a variety of demand types (e.g., Slepian-Wolf, Ahlswede-Körner). Given the difficulty of the general problem, it is not surprising that the collection of networks fully solved to date remains very small. This work investigates several new approaches to bounding the achievable rate region for general network source coding problems: reducing a network to an equivalent network or collection of networks, investigating the effect of feedback on achievable rates, and characterizing the role of side information.

We describe two approaches aimed at simplifying capacity calculations in a large network. First, we prove the optimality of separation between network coding and channel coding for networks of point-to-point channels with a Byzantine adversary. Next, we give a strategy for calculating the capacity of an error-free network by decomposing that network into smaller networks. We show that this strategy is optimal for a large class of networks and give a bound for the remaining cases.

To date, the role of feedback in network source coding has received very little attention. We present several examples of networks demonstrating that feedback can increase the set of achievable rates in both lossy and lossless network source coding settings. We derive general upper and lower bounds on the rate regions for networks with limited feedback that demonstrate a fundamental tradeoff between the forward rate and the feedback rate. For zero-error source coding with limited feedback and decoder side information, we derive the exact tradeoff between the forward rate and the feedback rate for several classes of sources.
A surprising result is that even zero-rate feedback can reduce the optimal forward rate by an arbitrary factor. Side information can be used to reduce the rates required for reliable communication. We characterize the exact achievable rate region for multicast networks with side information at the sinks and derive upper and lower bounds on the achievable rate region for other demand types.
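To make the kind of capacity calculation at issue concrete, consider a standard illustration (textbook material, not a result of this work): for multicast, the result of Ahlswede, Cai, Li, and Yeung states that the capacity of a directed acyclic network of unit-capacity links equals the smallest max-flow (min-cut) from the source to any sink. A minimal Python sketch computes this bound on the classic butterfly network:

```python
import copy
from collections import defaultdict, deque

def max_flow(cap, s, t):
    """Edmonds-Karp: augment along shortest residual paths until none remain."""
    flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, c in cap[u].items():
                if v not in parent and c > 0:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        # trace the path back, find its bottleneck, and update residuals
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= bottleneck
            cap[v][u] += bottleneck
        flow += bottleneck

# Butterfly network with unit-capacity edges; 's' multicasts to 't1' and 't2'.
base = defaultdict(lambda: defaultdict(int))
for u, v in [('s', 'a'), ('s', 'b'), ('a', 'c'), ('b', 'c'), ('c', 'd'),
             ('a', 't1'), ('b', 't2'), ('d', 't1'), ('d', 't2')]:
    base[u][v] = 1

# Multicast capacity = smallest max-flow (min-cut) over the sinks.
cut = min(max_flow(copy.deepcopy(base), 's', t) for t in ('t1', 't2'))
print(cut)  # → 2
```

Routing alone cannot deliver 2 symbols per use to both sinks on this network; achieving the min-cut bound requires coding (XOR-ing the two flows) at the bottleneck node, which is the canonical motivation for network coding.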
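The zero-error side-information setting can likewise be made concrete via Witsenhausen's characteristic-graph formulation. In this toy example (the source distribution is illustrative, not drawn from this work), two source symbols are "confusable" if some side-information value is consistent with both; coloring the resulting graph shows how decoder side information halves the required rate:

```python
import itertools
import math

# Source alphabet X = {0,1,2,3}; decoder side information y satisfies
# y = x or y = (x+1) mod 4, so y narrows x down to two candidates.
X = range(4)

def possible(x, y):
    """True iff (x, y) lies in the support of the joint source."""
    return y == x or y == (x + 1) % 4

# x and x' are confusable (need distinct codewords) when some y is
# consistent with both; these pairs are the characteristic graph's edges.
edges = {(x, xp) for x in X for xp in X
         if x < xp and any(possible(x, y) and possible(xp, y) for y in range(4))}

# Greedy coloring: symbols sharing a color share a codeword, and the
# decoder resolves the ambiguity using its side information y.
color = {}
for x in X:
    used = {color[xp] for xp in color if (min(x, xp), max(x, xp)) in edges}
    color[x] = next(c for c in itertools.count() if c not in used)

num_colors = len(set(color.values()))
print(num_colors)                    # → 2 colors suffice
print(math.log2(num_colors), "bit instead of", math.log2(len(X)), "bits")
```

Here the characteristic graph is a 4-cycle, so two codewords suffice: the encoder sends 1 bit instead of the 2 bits needed without side information, and the decoder uses y to pick the right symbol within each color class.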
