The Feedback Capacity of the Binary Erasure Channel With a No-Consecutive-Ones Input Constraint

The input-constrained erasure channel with feedback is considered, where the binary input sequence contains no consecutive ones, i.e., it satisfies the (1, ∞)-RLL constraint. We derive the capacity of this setting, which can be expressed as C_ε = max_{0 ≤ p ≤ 0.5} [(1−ε) H_b(p)] / [1 + (1−ε) p], where ε is the erasure probability and H_b(·) is the binary entropy function. Moreover, we prove that a priori knowledge of the erasures at the encoder does not increase the feedback capacity. The feedback capacity is computed via an equivalent dynamic programming (DP) formulation whose optimal average reward equals the capacity. Furthermore, the solution of the DP yields an optimal encoding procedure, leading to a capacity-achieving, zero-error coding scheme for our setting. DP is thus shown to be a tool not only for solving optimization problems, such as capacity calculation, but also for constructing optimal coding schemes. The derived capacity expression also serves as the only known non-trivial upper bound on the capacity of the input-constrained erasure channel without feedback, a problem that remains open.
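As a quick numerical sanity check on the capacity expression above, the following Python sketch evaluates C_ε for a few erasure probabilities. The helper names binary_entropy and feedback_capacity are illustrative, not from the paper, and SciPy's bounded scalar minimizer is used for the one-dimensional maximization over p. At ε = 0 the value should recover log2 of the golden ratio ≈ 0.694, the noiseless capacity of the (1, ∞)-RLL constraint.

```python
# Minimal sketch: numerically evaluate
# C_eps = max_{0 <= p <= 0.5} (1 - eps) * H_b(p) / (1 + (1 - eps) * p).
import numpy as np
from scipy.optimize import minimize_scalar


def binary_entropy(p: float) -> float:
    """Binary entropy H_b(p) in bits, with H_b(0) = H_b(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)


def feedback_capacity(eps: float) -> float:
    """Maximize (1 - eps) * H_b(p) / (1 + (1 - eps) * p) over p in [0, 0.5]."""
    objective = lambda p: -(1.0 - eps) * binary_entropy(p) / (1.0 + (1.0 - eps) * p)
    res = minimize_scalar(objective, bounds=(0.0, 0.5), method="bounded")
    return -res.fun


if __name__ == "__main__":
    for eps in (0.0, 0.25, 0.5, 0.75):
        print(f"eps = {eps:.2f}  ->  C_eps ~ {feedback_capacity(eps):.4f} bits/channel use")
```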
