Some Results on Distributed Source Coding for Interactive Function Computation

A two-terminal interactive distributed source coding problem with alternating messages for function computation at both locations is studied. For any number of messages, a computable characterization of the rate region is provided in terms of single-letter information measures. While interaction is useless in terms of the minimum sum-rate for lossless source reproduction at one or both locations, the gains can be arbitrarily large for function computation, even when the sources are independent. For a class of sources and functions, interaction is shown to be useless, even with an infinite number of messages, when a function has to be computed at only one location, but is shown to be useful if functions have to be computed at both locations. For computing the Boolean AND function of two independent Bernoulli sources at both locations, an achievable infinite-message sum-rate with infinitesimal-rate messages is derived in terms of a two-dimensional definite integral and a rate-allocation curve. The benefit of interaction is highlighted through examples of multiterminal function computation problems. For networks with a star topology, multiple rounds of interactive coding are shown to decrease the scaling law of the total network rate by an order of magnitude as the network grows.
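
For context, the lossless-reproduction baselines referred to above are the classical Slepian-Wolf sum-rates; the following is a minimal restatement, under the assumption (ours, for concreteness) that terminal A observes an i.i.d. source $X$ and terminal B observes a correlated i.i.d. source $Y$:

\begin{align*}
  R^{\min}_{\mathrm{sum}}\,(\text{reproduce } X \text{ at B}) &= H(X \mid Y),\\
  R^{\min}_{\mathrm{sum}}\,(\text{reproduce } X \text{ at B and } Y \text{ at A}) &= H(X \mid Y) + H(Y \mid X).
\end{align*}

Neither sum-rate can be reduced by allowing additional rounds of alternating messages, which is the sense in which interaction is useless for lossless source reproduction.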
