On Entropy and Lyapunov Exponents for Finite-State Channels

The Finite-State Markov Channel (FSMC) is a time-varying channel whose state evolves according to a finite-state Markov chain. These channels have infinite memory, which complicates their capacity analysis. We develop a new method to characterize the capacity of these channels based on Lyapunov exponents. Specifically, we show that the input, output, and conditional entropies for this channel are equivalent to the largest Lyapunov exponents for a particular class of random matrix products. We then show that the Lyapunov exponents can be expressed as expectations with respect to the stationary distributions of a class of continuous state-space Markov chains. The stationary distributions for this class of Markov chains are shown to be unique and continuous functions of the input symbol probabilities, provided that the input sequence has finite memory. These properties allow us to express mutual information and channel capacity in terms of Lyapunov exponents. We then leverage this connection between entropy and Lyapunov exponents to develop a rigorous theory for computing or approximating entropy and mutual information for finite-state channels with dependent inputs. We develop a method for directly computing the entropy of finite-state channels that does not rely on simulation, and we establish its convergence. We also obtain a new asymptotically tight lower bound for entropy based on norms of random matrix products. In addition, we prove a new functional central limit theorem for sample entropy and apply this theorem to characterize the error in simulated estimates of entropy. Finally, we present numerical examples of mutual information computation for ISI channels and observe the capacity benefits of adding memory to the input sequence for such channels.
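The connection the abstract describes can be illustrated with a standard construction: for a hidden-Markov (finite-state) channel model, the output entropy rate equals minus the top Lyapunov exponent of the random matrix products arising in the normalized forward (filtering) recursion, since the per-step normalizers multiply out to the sequence likelihood. The sketch below estimates that exponent by simulation. It is an illustrative Monte Carlo version of the idea, not the paper's direct (simulation-free) computation method; the two-state transition matrix `A` and output matrix `B` are arbitrary example parameters, not taken from the paper.

```python
import numpy as np

def entropy_rate_estimate(A, B, n, rng):
    """Estimate the output entropy rate of a finite-state channel model
    as minus the top Lyapunov exponent of the random matrix products
    in the normalized forward recursion.

    A[i, j] = P(next state j | state i);  B[i, y] = P(output y | state i).
    """
    k = A.shape[0]
    # Stationary distribution of the state chain (Perron eigenvector of A^T).
    evals, evecs = np.linalg.eig(A.T)
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()

    s = rng.choice(k, p=pi)   # initial hidden state
    alpha = pi.copy()         # forward (prediction/filtering) vector
    log_lik = 0.0
    for _ in range(n):
        # Simulate one channel step: state transition, then output symbol.
        s = rng.choice(k, p=A[s])
        y = rng.choice(B.shape[1], p=B[s])
        # One random-matrix factor applied to the forward vector:
        # alpha <- alpha * A * diag(B[:, y]).
        alpha = (alpha @ A) * B[:, y]
        c = alpha.sum()       # normalizer = P(y_t | y_1, ..., y_{t-1})
        log_lik += np.log(c)
        alpha /= c            # renormalize to keep the product stable
    # -(1/n) log P(y_1..y_n): sample entropy, in nats per symbol.
    return -log_lik / n

# Illustrative two-state example (parameters chosen arbitrarily).
rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.2, 0.8]])    # state transitions
B = np.array([[0.95, 0.05], [0.6, 0.4]])  # state-dependent output probabilities
h_est = entropy_rate_estimate(A, B, 100_000, rng)
```

As a sanity check, making both rows of `B` identical yields an i.i.d. output sequence, and the estimate converges to the corresponding scalar entropy; the functional central limit theorem mentioned in the abstract is what quantifies the error of exactly this kind of simulated estimate.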
