Bounds on the symmetric binary cutoff rate for dispersive Gaussian channels

Bounds on the symmetric binary cutoff rate for pulse amplitude modulated (PAM) signaling over dispersive Gaussian channels are evaluated and discussed. These easily calculable bounds can be used to estimate the reliable rate of information transmission and the error-exponent behavior of binary (two-level) PAM schemes operating over a prefiltered additive white Gaussian noise channel whose memory is long enough to make exact evaluation of the cutoff rate formidable. The core of the bounding technique relies on a probabilistic interpretation of a fundamental theorem in matrix theory concerning the logarithm of the largest eigenvalue of a nonnegative primitive matrix, commonly applied in large-deviation problems. The bounds are calculated for several examples and their respective tightness is examined. Further potential applications of the proposed bounding technique are pointed out.
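The abstract does not spell out how the matrix is constructed from the channel, but the computational step it alludes to, the logarithm of the largest eigenvalue of a nonnegative primitive matrix (a Perron-Frobenius quantity), is simple to evaluate numerically. The sketch below is only an illustration under the assumption of a hypothetical 2x2 weight matrix A; in the paper, the corresponding matrix would be derived from the channel's intersymbol-interference coefficients and signal-to-noise ratio.

```python
import numpy as np

def log_perron_eigenvalue(A: np.ndarray) -> float:
    """Return log2 of the largest (Perron) eigenvalue of a nonnegative primitive matrix A.

    By the Perron-Frobenius theorem, the spectral radius of such a matrix is itself
    a real positive eigenvalue, so taking the maximum modulus recovers it.
    """
    eigenvalues = np.linalg.eigvals(A)
    return float(np.log2(np.max(np.abs(eigenvalues))))

# Hypothetical nonnegative primitive matrix standing in for the state-transition
# weight matrix that the bounding technique would build from the channel model.
A = np.array([[1.0, 0.5],
              [0.8, 1.0]])

print(log_perron_eigenvalue(A))
```

For channels with long memory the matrix grows exponentially with the memory length, which is precisely why the paper pursues bounds rather than exact evaluation of this eigenvalue.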
