Structured Random Codes and Sensor Network Coding Theorems

In the Shannon-theoretic analysis of joint source-channel coding problems, achievability is usually established via a two-stage approach: the sources are compressed into bits, and these bits are reliably communicated across the noisy channels. Random coding arguments are the backbone of both stages of the proof. This "separation" strategy establishes the optimal performance not only for stationary ergodic point-to-point problems, but also for a number of simple network situations, such as independent sources communicated with respect to separate fidelity criteria across a multiple-access channel. Beyond such simple cases, however, separation-based coding is suboptimal for general networks. For instance, for a simple Gaussian sensor network, uncoded transmission is exactly optimal and performs exponentially better than a separation-based solution. In this note, we generalize this sensor network strategy by employing a lattice code. The underlying linear structure of our code is crucial to its success.
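
To make the uncoded-transmission claim concrete, the following Python sketch (an illustration under assumed parameters, not the construction from this note) simulates amplify-and-forward transmission of a common Gaussian source over a Gaussian multiple-access channel: each of M sensors observes the source in independent noise, scales its observation to meet a per-sensor power constraint P, and the fusion center forms the linear MMSE estimate of the source from the superimposed channel output. The parameter values and variable names (M, P, sigma_s2, sigma_w2, sigma_z2) are arbitrary assumptions, chosen only to check the empirical distortion against its closed-form expression.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 20           # number of sensors (illustrative)
n = 200_000      # number of i.i.d. source samples
sigma_s2 = 1.0   # source variance
sigma_w2 = 0.1   # observation-noise variance at each sensor
sigma_z2 = 1.0   # channel-noise variance at the fusion center
P = 1.0          # per-sensor transmit power constraint

# Source S and noisy sensor observations U_m = S + W_m
S = rng.normal(0.0, np.sqrt(sigma_s2), n)
W = rng.normal(0.0, np.sqrt(sigma_w2), (M, n))
U = S + W

# Uncoded (amplify-and-forward) transmission: X_m = a * U_m with E[X_m^2] = P
a = np.sqrt(P / (sigma_s2 + sigma_w2))
X = a * U

# Gaussian multiple-access channel: superposition of all transmissions plus noise
Z = rng.normal(0.0, np.sqrt(sigma_z2), n)
Y = X.sum(axis=0) + Z

# Linear MMSE estimator of S from Y, using the known second-order statistics:
# Y = a*M*S + a*sum_m W_m + Z
cov_sy = a * M * sigma_s2
var_y = (a * M) ** 2 * sigma_s2 + a ** 2 * M * sigma_w2 + sigma_z2
c = cov_sy / var_y
S_hat = c * Y

D_empirical = np.mean((S - S_hat) ** 2)
D_analytical = sigma_s2 - cov_sy ** 2 / var_y

print(f"empirical distortion : {D_empirical:.4f}")
print(f"analytical distortion: {D_analytical:.4f}")
```

In this sketch the observation noise of the individual sensors averages out in the channel superposition as M grows, which is the mechanism behind the scaling advantage of uncoded transmission over separation-based schemes in this setting.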
