Maximum Entropy Functions: Approximate Gács-Körner for Distributed Compression

Consider two correlated sources X and Y drawn from a joint distribution p_{X,Y}. Their Gács-Körner common information, a measure of common information that exploits the combinatorial structure of the distribution p_{X,Y}, leads to a source decomposition that exhibits the latent common parts of X and Y. Using this source decomposition, we construct an efficient distributed compression scheme, which also extends naturally to the network setting. We then relax the combinatorial conditions on the source distribution, which yields an efficient scheme with a helper node that can be thought of as a front-end cache. This relaxation introduces an inherent trade-off between the rate of the helper and the rate reduction at the sources, which we capture by a notion of optimal decomposition, formulated as an approximate Gács-Körner optimization. Finally, we discuss properties of this optimization and establish connections with the maximal correlation coefficient, as well as an efficient algorithm, both through the application of spectral graph theory to the bipartite graph induced by p_{X,Y}.
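The two quantities the abstract builds on can be computed directly for small alphabets. The sketch below, a minimal illustration and not the paper's algorithm, uses the standard characterization of Gács-Körner common information as the entropy of the connected components of the bipartite graph with an edge (x, y) whenever p_{X,Y}(x, y) > 0, and the standard spectral characterization of the maximal correlation coefficient as the second singular value of the matrix Q[x, y] = p(x, y) / sqrt(p(x) p(y)). Function names are illustrative.

```python
import numpy as np
from itertools import product


def gacs_korner_common_info(P):
    """Gács-Körner common information of a joint pmf P[x, y] (in bits).

    Builds the bipartite graph with an edge (x, y) whenever P[x, y] > 0;
    its connected components define the common variable W, and the
    common information equals H(W).
    """
    nx, ny = P.shape
    # Union-find over X-symbols 0..nx-1 and Y-symbols nx..nx+ny-1.
    parent = list(range(nx + ny))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for x, y in product(range(nx), range(ny)):
        if P[x, y] > 0:
            parent[find(x)] = find(nx + y)

    # Probability mass of each component gives the pmf of W.
    w_mass = {}
    for x, y in product(range(nx), range(ny)):
        if P[x, y] > 0:
            r = find(x)
            w_mass[r] = w_mass.get(r, 0.0) + P[x, y]
    probs = np.array(list(w_mass.values()))
    return float(-np.sum(probs * np.log2(probs)))


def maximal_correlation(P):
    """Hirschfeld-Gebelein-Rényi maximal correlation of P[x, y],
    computed as the second singular value of Q = D_x^{-1/2} P D_y^{-1/2}.
    The top singular value is always 1 (the constant functions)."""
    px = P.sum(axis=1)
    py = P.sum(axis=0)
    Q = P / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(Q, compute_uv=False)
    return float(s[1])


# Block-diagonal example: X and Y deterministically share one bit
# (the block index), and are independent within each block.
P = np.full((4, 4), 0.125)
P[:2, 2:] = 0.0
P[2:, :2] = 0.0
```

For this P, the bipartite graph has two components of mass 1/2 each, so the common information is 1 bit, and the maximal correlation is 1, consistent with the known fact that a nontrivial Gács-Körner common part forces maximal correlation equal to 1. For a product distribution, both quantities are 0.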
