Lower Bounds and Approximations for the Information Rate of the ISI Channel

We consider the discrete-time intersymbol interference (ISI) channel model with additive Gaussian noise and fixed, independent and identically distributed (i.i.d.) inputs. In this setting, we investigate the expression put forth by Shamai and Laroia as a conjectured lower bound on the input-output mutual information after the application of a minimum mean-square error decision-feedback equalizer (MMSE-DFE) receiver. A low signal-to-noise ratio (SNR) expansion is used to prove that the conjectured bound does not hold under general conditions, and to characterize inputs for which it is particularly ill-suited. One such input is used to construct a counterexample showing that the Shamai-Laroia expression does not always lower-bound even the achievable rate of the channel, thereby ruling out a natural relaxation of the original conjectured bound. However, this relaxed bound is shown to hold for any finite-entropy input and any ISI channel, provided the SNR is sufficiently high. We derive two conditions under which the relaxed bound holds, based on compound-channel capacity and quasiconvexity arguments, respectively. Finally, new simple bounds on the achievable rate are proven and compared with other known bounds. Information-estimation relations and estimation-theoretic bounds play a key role in establishing our results.
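
For reference, the conjectured Shamai-Laroia expression can be sketched explicitly. The following is an illustrative reconstruction in standard notation, assuming the model $y_k = \sum_i h_i x_{k-i} + n_k$ with i.i.d. unit-power inputs, white Gaussian noise of variance $1/\mathrm{SNR}$, channel transfer function $H(e^{j\omega})$, and $I_X(\cdot)$ denoting the mutual information of a scalar Gaussian channel under the input distribution of a single symbol:

$$\mathcal{I}_{\mathrm{SL}} = I_X\!\left(\mathrm{SNR}_{\mathrm{SL}}\right), \qquad \mathrm{SNR}_{\mathrm{SL}} = \exp\!\left\{\frac{1}{2\pi}\int_{-\pi}^{\pi}\log\!\left(1+\mathrm{SNR}\,\bigl|H(e^{j\omega})\bigr|^{2}\right)\mathrm{d}\omega\right\} - 1.$$

Here $\mathrm{SNR}_{\mathrm{SL}}$ is the output SNR of the unbiased MMSE-DFE (cf. [19]), and the conjecture asserts that the information rate $\mathcal{I}$ of the channel satisfies $\mathcal{I} \ge \mathcal{I}_{\mathrm{SL}}$.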

[1] A. D. Wyner, "Upper bound on error probability for detection with unbounded intersymbol interference," The Bell System Technical Journal, 1975.

[2] S. Shamai and R. Laroia, "The intersymbol interference channel: lower bounds on capacity and channel precoding loss," IEEE Trans. Inf. Theory, 1996.

[3] J. Moon et al., "Easily Computed Lower Bounds on the Information Rate of Intersymbol Interference Channels," IEEE Trans. Inf. Theory, 2011.

[4] D. Fertonani et al., "Lower bounds on the information rate of intersymbol interference channels based on the Ungerboeck observation model," in Proc. IEEE Int. Symp. on Information Theory (ISIT), 2009.

[5] A. M. Tulino et al., "Optimum power allocation for parallel Gaussian channels with arbitrary input distributions," IEEE Trans. Inf. Theory, 2006.

[6] P. Sadeghi et al., "Optimization of Information Rate Upper and Lower Bounds for Channels With Memory," IEEE Trans. Inf. Theory, 2007.

[7] D. G. Messerschmitt, Digital Communications, 2013.

[8] H.-A. Loeliger et al., "On the information rate of binary-input channels with memory," in Proc. IEEE Int. Conf. on Communications (ICC), 2001.

[9] R. M. Gray, "Toeplitz and Circulant Matrices: A Review," Found. Trends Commun. Inf. Theory, 2005.

[10] C. Meinel et al., Digital Communication, X.media.publishing, 2014.

[11] V. Erceg et al., "TGn Channel Models," 2004.

[12] S. Shamai et al., "Comparison of the Achievable Rates in OFDM and Single Carrier Modulation with I.I.D. Inputs," IEEE Trans. Inf. Theory, 2013.

[13] S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004.

[14] E. Telatar et al., "The Compound Channel Capacity of a Class of Finite-State Channels," IEEE Trans. Inf. Theory, 1998.

[15] G. D. Forney, "Maximum-likelihood sequence estimation of digital sequences in the presence of intersymbol interference," IEEE Trans. Inf. Theory, 1972.

[16] R. M. Gray, Entropy and Information Theory, Springer, 1990.

[17] L. Zheng et al., "A Coordinate System for Gaussian Networks," IEEE Trans. Inf. Theory, 2010.

[18] S. Verdú et al., "Maximum likelihood sequence detection for intersymbol interference channels: A new upper bound on error probability," IEEE Trans. Inf. Theory, 1987.

[19] J. M. Cioffi et al., "MMSE decision-feedback equalizers and coding. I. Equalization results," IEEE Trans. Commun., 1995.

[20] S. Shamai et al., "Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs," IEEE Trans. Inf. Theory, 1991.

[21] X. Ma et al., "Binary intersymbol interference channels: Gallager codes, density evolution, and code performance bounds," IEEE Trans. Inf. Theory, 2003.

[22] P. H. Siegel et al., "On the achievable information rates of finite state ISI channels," in Proc. IEEE GLOBECOM, 2001.

[23] M. Stojanovic et al., "Bounds on the Information Rate for Sparse Channels with Long Memory and i.u.d. Inputs," IEEE Trans. Commun., 2011.

[24] S. Shamai et al., "Mutual information and minimum mean-square error in Gaussian channels," IEEE Trans. Inf. Theory, 2004.

[25] S. Shamai et al., "Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error," IEEE Trans. Inf. Theory, 2010.

[26] G. J. Foschini, "Performance bound for maximum-likelihood reception of digital data," IEEE Trans. Inf. Theory, 1975.

[27] T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 2005.

[28] J. E. Mazo et al., "Digital communications," Proceedings of the IEEE, 1985.

[29] Z. Ding et al., "Globally Optimal Linear Precoders for Finite Alphabet Signals Over Complex Vector Gaussian Channels," IEEE Trans. Signal Process., 2011.

[30] W. Zeng et al., "Simulation-Based Computation of Information Rates for Channels With Memory," IEEE Trans. Inf. Theory, 2006.