Computation of Information Rates from Finite-State Source/Channel Models

It has recently become feasible to compute the information rate of finite-state source/channel models with a moderate number of states. We review such methods and show how they extend to compute upper and lower bounds on the information rate of very general (non-finite-state) channels by means of finite-state approximations.
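The core numerical method behind this feasibility is simple to sketch: simulate a long output sequence y_1, ..., y_n of the finite-state model, compute p(y_1, ..., y_n) with the forward sum-product (BCJR-type) recursion on the trellis, and use -(1/n) log2 p(y_1, ..., y_n) as an estimate of the entropy rate h(Y), which converges by the Shannon-McMillan-Breiman theorem. The information rate then follows as I(X;Y) = h(Y) - h(Y|X), where h(Y|X) is available in closed form when the noise is Gaussian. The Python sketch below illustrates this for one concrete assumption: a binary-input dicode ISI channel y_k = x_k - x_{k-1} + AWGN with i.i.d. uniform inputs; the channel choice and all names here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, sigma):
    """Simulate the dicode channel y_k = x_k - x_{k-1} + AWGN."""
    x = rng.choice([-1.0, 1.0], size=n + 1)        # i.i.d. uniform +/-1 inputs
    return x[1:] - x[:-1] + sigma * rng.normal(size=n)

def estimate_hY(y, sigma):
    """Estimate h(Y) = -(1/n) log2 p(y_1^n) bits/symbol via the
    forward sum-product (BCJR-type) recursion on the 2-state trellis."""
    states = [-1.0, 1.0]              # state = previous channel input
    alpha = np.array([0.5, 0.5])      # uniform initial state distribution
    log_p = 0.0
    for yk in y:
        new = np.zeros(2)
        for i, s in enumerate(states):        # previous input (old state)
            for j, x in enumerate(states):    # current input -> next state
                mean = x - s                  # noiseless channel output
                lik = np.exp(-(yk - mean) ** 2 / (2 * sigma ** 2))
                new[j] += alpha[i] * 0.5 * lik    # input prior P(x) = 1/2
        z = new.sum()                 # scale factor; log p(y^n) = sum of log z
        log_p += np.log2(z)
        alpha = new / z               # normalize to avoid underflow
    # restore the Gaussian constant dropped from each branch likelihood
    log_p -= len(y) * 0.5 * np.log2(2 * np.pi * sigma ** 2)
    return -log_p / len(y)

n, sigma = 50_000, 0.8
hY = estimate_hY(simulate(n, sigma), sigma)
hY_given_X = 0.5 * np.log2(2 * np.pi * np.e * sigma ** 2)  # Gaussian noise entropy
print(f"estimated I(X;Y): {hY - hY_given_X:.3f} bits/symbol")
```

The same recursion applies to any finite-state source/channel model; only the state set, the input prior, and the branch likelihoods p(y_k | s, x) change.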
