On the achievable information rates of finite-state input two-dimensional channels with memory

The achievable information rate of finite-state input two-dimensional (2-D) channels with memory is an open problem, relevant, for example, to inter-symbol interference (ISI) channels and cellular multiple-access channels. We propose a method for simulation-based computation of such information rates. We first draw a connection between the Shannon-theoretic information rate and the statistical-mechanics notion of free energy. Since the free energy of such systems is intractable, we approximate it using the cluster variation method, implemented via generalized belief propagation. The resulting, fully tractable algorithm is shown to provide a practically accurate estimate of the information rate. In our experimental study we calculate the information rates of 2-D ISI channels and of hexagonal Wyner cellular networks with binary inputs, for which only bounds were previously known.
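To make the simulation-based idea concrete, the sketch below shows its well-known one-dimensional counterpart: estimating the symmetric information rate of a binary-input 1-D ISI channel by running a long simulated output sequence through a sum-product (BCJR-style) forward recursion to get -log p(y)/n, then subtracting the Gaussian conditional entropy h(Y|X). This is a minimal illustration of the 1-D method, not the paper's 2-D generalized-belief-propagation algorithm; the channel taps, noise level, and block length are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000           # simulated block length (assumption)
sigma = 0.5          # noise standard deviation (assumption)
h = np.array([1.0, 0.5])  # illustrative two-tap 1-D ISI channel

# Simulate i.u.d. binary (+/-1) inputs through the ISI channel with AWGN.
x = rng.choice([-1.0, 1.0], size=n + 1)
y = h[0] * x[1:] + h[1] * x[:-1] + sigma * rng.standard_normal(n)

# Forward (alpha) recursion over the 2-state trellis; the state is the
# previous input symbol. Scaling at each step accumulates log p(y_1..n).
states = np.array([-1.0, 1.0])
alpha = np.full(2, 0.5)
logp = 0.0
for t in range(n):
    # Branch likelihoods: means[prev, new] = h0*new + h1*prev.
    means = h[0] * states[None, :] + h[1] * states[:, None]
    lik = np.exp(-(y[t] - means) ** 2 / (2 * sigma**2))
    alpha = 0.5 * alpha @ lik   # uniform input prior, marginalize prev state
    scale = alpha.sum()
    logp += np.log(scale)
    alpha /= scale

h_y = -logp / (n * np.log(2))                          # empirical -log2 p(y)/n
h_y_given_x = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)  # Gaussian noise entropy
rate = h_y - h_y_given_x
print(f"estimated I(X;Y) ~ {rate:.3f} bits/channel use")
```

In 1-D this forward recursion computes p(y) exactly, so the estimate converges to the true information rate as n grows; in 2-D no such exact recursion exists, which is why the paper resorts to a cluster-variation (generalized belief propagation) approximation of the free energy.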
