Easily Computed Lower Bounds on the Information Rate of Intersymbol Interference Channels

Provable lower bounds are presented for the information rate I(X; X+S+N), where X is a symbol drawn independently and uniformly from a finite alphabet, S is a discrete-valued random variable (RV), and N is a Gaussian RV. It is well known that, with S representing the precursor intersymbol interference (ISI) at the output of the decision feedback equalizer (DFE), I(X; X+S+N) serves as a tight lower bound for both the symmetric information rate (SIR) and the capacity of the ISI channel corrupted by Gaussian noise. When evaluated on a number of well-known finite-ISI channels, the new bounds are essentially as tight against the SIR as the conjectured lower bound of Shamai and Laroia across all signal-to-noise ratio (SNR) ranges, and are in fact tighter when examined closely at high SNRs. The new lower bounds are obtained in two steps: first, a "mismatched" mutual information function is introduced and proved to be a lower bound to I(X; X+S+N); second, this function is itself bounded from below by an expression that can be computed easily via a few one-dimensional integrations at a small computational cost.
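As a concrete illustration (not the paper's algorithm), the sketch below numerically evaluates I(X; X+S+N) for a toy setting: BPSK input X, a hypothetical three-point interference variable S, and Gaussian noise N. It uses the decomposition I(X; Y) = h(Y) - h(S+N) with Y = X+S+N, where each differential entropy is a single one-dimensional integral. All numerical values (sigma, the support and probabilities of S) are illustrative assumptions; the paper's contribution concerns bounds that remain cheap when the support of S is too large to enumerate directly.

```python
import numpy as np
from scipy.integrate import quad

# Toy sketch: exact I(X; X+S+N) for a small discrete S, via two
# one-dimensional integrals. Parameters below are assumed, not from
# the paper.
sigma = 0.5                             # noise standard deviation (assumed)
s_vals = np.array([-0.2, 0.0, 0.2])     # hypothetical residual-ISI values
s_probs = np.array([0.25, 0.5, 0.25])   # their probabilities
x_vals = np.array([-1.0, 1.0])          # BPSK alphabet, uniform prior

def gauss(y, mu):
    # Gaussian density with mean mu and standard deviation sigma
    return np.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def pdf_s_plus_n(y):
    # density of S + N: a finite Gaussian mixture over the support of S
    return np.sum(s_probs * gauss(y, s_vals))

def pdf_y(y):
    # density of Y = X + S + N, averaging over the uniform input X
    return np.mean([np.sum(s_probs * gauss(y, x + s_vals)) for x in x_vals])

def diff_entropy(pdf, lo=-8.0, hi=8.0):
    # h(f) = -integral of f log2 f, computed by 1-D numerical integration
    integrand = lambda y: -pdf(y) * np.log2(pdf(y)) if pdf(y) > 0 else 0.0
    val, _ = quad(integrand, lo, hi, limit=200)
    return val

# I(X; Y) = h(Y) - h(Y|X) = h(Y) - h(S + N)
mi_bits = diff_entropy(pdf_y) - diff_entropy(pdf_s_plus_n)
print(f"I(X; X+S+N) ~= {mi_bits:.4f} bits/symbol")
```

In this toy case the two integrals already give the exact mutual information; when S is the precursor ISI at a DFE output, its support is effectively unenumerable, which is why easily computed lower bounds of the kind the abstract describes are useful.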

[1] S. Shamai (Shitz), L. H. Ozarow, and A. D. Wyner, "Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs," IEEE Trans. Inf. Theory, 1991.

[2] R. G. Gallager, Information Theory and Reliable Communication. New York: Wiley, 1968.

[3] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed. Hoboken, NJ: Wiley, 2005.

[4] H. D. Pfister, J. B. Soriaga, and P. H. Siegel, "On the achievable information rates of finite state ISI channels," in Proc. IEEE GLOBECOM, 2001.

[5] D. Arnold and H.-A. Loeliger, "On the information rate of binary-input channels with memory," in Proc. IEEE ICC, 2001.

[6] D. M. Arnold, H.-A. Loeliger, P. O. Vontobel, A. Kavčić, and W. Zeng, "Simulation-based computation of information rates for channels with memory," IEEE Trans. Inf. Theory, 2006.

[7] S. Shamai (Shitz) and R. Laroia, "The intersymbol interference channel: Lower bounds on capacity and channel precoding loss," IEEE Trans. Inf. Theory, 1996.

[8] D. Fertonani, A. Barbieri, and G. Colavolpe, "Lower bounds on the information rate of intersymbol interference channels based on the Ungerboeck observation model," in Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2009.

[9] P. Sadeghi, P. O. Vontobel, and R. Shams, "Optimization of information rate upper and lower bounds for channels with memory," IEEE Trans. Inf. Theory, 2007.

[10] V. Sharma and S. K. Singh, "Entropy and channel capacity in the regenerative setup with applications to Markov channels," in Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2001.

[11] W. Hirt, "Capacity and information rates of discrete-time channels with memory," Ph.D. dissertation, ETH Zürich, Zurich, Switzerland, 1988.

[12] L. L. Campbell et al., "Infinite series of interference variables with Cantor-type distributions," IEEE Trans. Inf. Theory, 1988.

[13] G. Ungerboeck, "Adaptive maximum-likelihood receiver for carrier-modulated data-transmission systems," IEEE Trans. Commun., 1974.

[14] D. G. Messerschmitt, "A geometric theory of intersymbol interference," Bell Syst. Tech. J., 1973.

[15] A. M. Garsia, "Entropy and singularity of infinite convolutions," Pacific J. Math., 1963.

[16] L. R. Bahl, J. Cocke, F. Jelinek, and J. Raviv, "Optimal decoding of linear codes for minimizing symbol error rate," IEEE Trans. Inf. Theory, 1974.

[17] G. D. Forney, Jr. and G. Ungerboeck, "Modulation and coding for linear Gaussian channels," IEEE Trans. Inf. Theory, 1998.

[18] S. Shamai (Shitz) and S. Verdú, "Worst-case power-constrained noise for binary-input channels," IEEE Trans. Inf. Theory, 1992.

[19] J. M. Cioffi, G. P. Dudevoir, M. V. Eyuboglu, and G. D. Forney, Jr., "MMSE decision-feedback equalizers and coding. Part I: Equalization results," IEEE Trans. Commun., 1995.