A lattice compress-and-forward scheme

We present a nested-lattice-code-based strategy that achieves the random-coding-based Compress-and-Forward (CF) rate for the three-node Gaussian relay channel. To do so, we first outline a lattice-based strategy for the (X + Z₁, X + Z₂) Wyner-Ziv problem of lossy source coding with side information in Gaussian noise, a re-interpretation of the nested-lattice-code-based Gaussian Wyner-Ziv scheme presented by Zamir, Shamai, and Erez. We use the notation (X + Z₁, X + Z₂) Wyner-Ziv to mean that the source is of the form X + Z₁ and the side information at the receiver is of the form X + Z₂, for independent Gaussian X, Z₁, and Z₂. We then use this (X + Z₁, X + Z₂) Wyner-Ziv scheme to implement a “structured,” or lattice-code-based, CF scheme for the Gaussian relay channel that achieves the same rate as the Cover-El Gamal CF scheme based on random Gaussian codebooks.
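For orientation, the two rate expressions involved may be sketched as follows; the notation here is ours and may differ from the paper's: X ~ N(0, σ_X²) is the common signal, Z₁ ~ N(0, N₁) and Z₂ ~ N(0, N₂) are independent noises, and X, X₁, Y, Y₁, Ŷ₁ are the relay-channel variables in the standard Cover-El Gamal notation. Since the source S = X + Z₁ and the side information Y = X + Z₂ are jointly Gaussian, the Wyner-Ziv rate-distortion function under quadratic distortion equals the conditional rate-distortion function,

\[ R_{\mathrm{WZ}}(D) = \frac{1}{2}\log^{+}\frac{\sigma^{2}_{S|Y}}{D}, \qquad \sigma^{2}_{S|Y} = N_{1} + \frac{\sigma_{X}^{2} N_{2}}{\sigma_{X}^{2} + N_{2}}, \]

where log⁺(t) = max{log t, 0}, while the Cover-El Gamal compress-and-forward rate that the lattice scheme is shown to match is

\[ R_{\mathrm{CF}} = \max \; I(X; \hat{Y}_{1}, Y \mid X_{1}) \quad \text{subject to} \quad I(X_{1}; Y) \ge I(Y_{1}; \hat{Y}_{1} \mid X_{1}, Y), \]

the maximum being over input distributions p(x)p(x₁) and compressions p(ŷ₁ | y₁, x₁).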

[1] R. Zamir, "Lattices are everywhere," 2009 Information Theory and Applications Workshop (ITA), 2009.

[2] U. Erez and R. Zamir, "Achieving 1/2 log(1+SNR) on the AWGN channel with lattice encoding and decoding," IEEE Transactions on Information Theory, 2004.

[3] G. D. Forney, Jr., "On the role of MMSE estimation in approaching the information-theoretic limits of linear Gaussian channels: Shannon meets Wiener," arXiv, 2004.

[4] B. Nazer and M. Gastpar, "Compute-and-Forward: Harnessing interference through structured codes," IEEE Transactions on Information Theory, 2011.

[5] R. Zamir, S. Shamai (Shitz), and U. Erez, "Nested linear/lattice codes for structured multiterminal binning," IEEE Transactions on Information Theory, 2002.

[6] S. H. Lim, Y.-H. Kim, A. El Gamal, and S.-Y. Chung, "Noisy Network Coding," IEEE Transactions on Information Theory, 2011.

[7] S. Sridharan, A. Jafarian, S. Vishwanath, S. A. Jafar, and S. Shamai (Shitz), "A layered lattice coding scheme for a class of three user Gaussian interference channels," 46th Annual Allerton Conference on Communication, Control, and Computing, 2008.

[8] W. Nam, S.-Y. Chung, and Y. H. Lee, "Capacity of the Gaussian two-way relay channel to within 1/2 bit," IEEE Transactions on Information Theory, 2010.

[9] T. M. Cover and A. El Gamal, "Capacity theorems for the relay channel," IEEE Transactions on Information Theory, 1979.

[10] A. D. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Transactions on Information Theory, 1976.

[11] M. P. Wilson, K. Narayanan, H. D. Pfister, and A. Sprintson, "Joint physical layer coding and network coding for bidirectional relaying," IEEE Transactions on Information Theory, 2010.

[12] H.-A. Loeliger, "Averaging bounds for lattices and linear codes," IEEE Transactions on Information Theory, 1997.

[13] D. Krithivasan and S. S. Pradhan, "A proof of the existence of good nested lattices," 2007.

[14] Y. Song and N. Devroye, "List decoding for nested lattices and applications to relay channels," 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton), 2010.

[15] W. Nam, S.-Y. Chung, and Y. H. Lee, "Nested lattice codes for Gaussian relay networks with interference," IEEE Transactions on Information Theory, 2011.

[16] A. El Gamal and Y.-H. Kim, "Lecture Notes on Network Information Theory," arXiv, 2010.

[17] U. Erez, S. Litsyn, and R. Zamir, "Lattices which are good for (almost) everything," IEEE Transactions on Information Theory, 2005.