Asymptotics of Input-Constrained Erasure Channel Capacity

In this paper, we study an input-constrained erasure channel and characterize the asymptotics of its capacity in the low-erasure-rate regime. More specifically, for a general memoryless erasure channel whose input is supported on an irreducible finite-type constraint, we derive partial asymptotics of the capacity using a series-expansion-type formula for the mutual information rate. For a binary erasure channel whose first-order Markovian input is supported on the $(1, \infty)$-RLL constraint, we exploit the concavity of the mutual information rate with respect to a parameterization of the input to numerically evaluate the first-order Markov capacity and to derive its full asymptotics. Compared with the recently derived feedback capacity of the binary erasure channel under the same input constraint, the asymptotics obtained in this paper lead to the conclusion that feedback may increase the capacity of an input-constrained channel, even if the channel is memoryless.
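To make the parameterization concrete, the following is a minimal sketch of the zero-erasure-rate anchor of these asymptotics, writing $\epsilon$ for the erasure rate and assuming the standard one-parameter family of first-order Markov inputs on the $(1, \infty)$-RLL constraint, $\theta = P(X_{k+1} = 1 \mid X_k = 0)$ (from state $1$ the chain must return to $0$, since no two adjacent $1$s are allowed). The stationary distribution and entropy rate are

$$ \pi = \Big( \frac{1}{1+\theta}, \, \frac{\theta}{1+\theta} \Big), \qquad H(\theta) = \frac{h_b(\theta)}{1+\theta}, \qquad \max_{0 \le \theta \le 1} H(\theta) = \log_2 \frac{1+\sqrt{5}}{2} \approx 0.6942, $$

where $h_b$ is the binary entropy function and the maximum is attained at $\theta^{\ast} = (3-\sqrt{5})/2 \approx 0.382$. At $\epsilon = 0$ the channel is noiseless, the mutual information rate reduces to $H(\theta)$, and the first-order Markov capacity equals the noiseless capacity of the golden-mean shift; the asymptotics expand the constrained capacity around this value.

Since $H(\theta)$ is concave in $\theta$ (the noiseless instance of the concavity invoked above), the maximizer can be located by a one-dimensional concave search. The Python sketch below (function names are ours, for illustration only) recovers $\theta^{\ast}$ and $\log_2$ of the golden ratio:

```python
import numpy as np

def h_b(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def entropy_rate(theta):
    """Entropy rate of the (1,inf)-RLL first-order Markov input:
    state 0 -> 1 w.p. theta; state 1 -> 0 w.p. 1 (no adjacent 1s)."""
    return h_b(theta) / (1.0 + theta)

def golden_section_max(f, lo=0.0, hi=1.0, tol=1e-10):
    """Maximize a concave function on [lo, hi] by golden-section search."""
    invphi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc > fd:                      # maximizer lies in [a, d]
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                            # maximizer lies in [c, b]
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

theta_star = golden_section_max(entropy_rate)
print(theta_star, entropy_rate(theta_star))
# ~0.381966 and ~0.694242 = log2((1 + sqrt(5)) / 2)
```

For $\epsilon > 0$, the same one-dimensional concave search applies once the mutual information rate itself is evaluated; this is the role the concavity of the mutual information rate plays in the numerical evaluation described above.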
