To Feed or Not to Feed Back

We study communication over finite-state channels (FSCs) in which the encoder and the decoder can control the availability or the quality of noise-free feedback from the decoder to the encoder. Specifically, the instantaneous feedback is a function of an action taken by the encoder, an action taken by the decoder, and the channel output. Encoder and decoder actions take values in finite alphabets and may be subject to average cost constraints. We prove capacity results for this setting by constructing a sequence of codes, using a simple code-tree-based scheme that generates channel input symbols along with encoder and decoder actions. We prove that the corresponding sequence of achievable rates has a limit, and we provide an upper bound on the maximum achievable rate. The upper and lower bounds coincide, and hence yield the capacity, for the case where every initial state has positive probability. Next, for indecomposable channels without intersymbol interference, the capacity is given as the limit of the normalized directed information between the input and output sequences, maximized over an appropriate set of causally conditioned distributions. As a special case of our framework, we characterize the capacity of coding on the backward link in FSCs, i.e., when the decoder sends limited-rate, instantaneous, coded noise-free feedback on the backward link. Finally, we propose an extension of the Blahut-Arimoto algorithm for evaluating the capacity when actions may be cost constrained, and we demonstrate its application in several examples. Among these is the "to feed or not to feed back" setting, in which the encoder takes binary actions that determine whether the current channel output will be fed back to the encoder, subject to a constraint on the fraction of channel outputs that are fed back.
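
For reference, the directed information appearing in the capacity expression above is Massey's quantity I(X^N -> Y^N) = sum_{n=1}^{N} I(X^n ; Y_n | Y^{n-1}), and the indecomposable-channel capacity is the limit of (1/N) I(X^N -> Y^N) maximized over causally conditioned input distributions. The paper's numerical method extends the Blahut-Arimoto algorithm to this directed-information setting with cost-constrained actions; that extension is not reproduced here. The sketch below only illustrates the classical Blahut-Arimoto iteration for the capacity of a discrete memoryless channel, which is the starting point of the extension. The function name, tolerance, and the binary-symmetric-channel example are illustrative choices, not taken from the paper.

    import numpy as np

    def blahut_arimoto(W, tol=1e-9, max_iter=10000):
        # Classical Blahut-Arimoto iteration for the capacity of a discrete
        # memoryless channel. W is a |X| x |Y| matrix with W[x, y] = P(y | x).
        # Returns the capacity estimate in bits and the optimizing input law.
        nx, _ = W.shape
        p = np.full(nx, 1.0 / nx)                # start from the uniform input
        for _ in range(max_iter):
            r = p @ W                            # induced output distribution r(y)
            q = (p[:, None] * W) / np.where(r > 0, r, 1.0)   # backward channel q(x|y)
            with np.errstate(divide="ignore"):
                log_q = np.where(q > 0, np.log(q), 0.0)
            c = np.exp((W * log_q).sum(axis=1))  # c(x) = exp(sum_y W(y|x) log q(x|y))
            p_new = c / c.sum()                  # multiplicative update of p(x)
            converged = np.max(np.abs(p_new - p)) < tol
            p = p_new
            if converged:
                break
        # Mutual information of the final input distribution, in bits.
        r = p @ W
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(W > 0, W * np.log2(W / r), 0.0)
        return float(p @ terms.sum(axis=1)), p

    # Binary symmetric channel with crossover 0.1:
    # the capacity should be close to 1 - h(0.1), about 0.531 bits.
    W_bsc = np.array([[0.9, 0.1],
                      [0.1, 0.9]])
    print(blahut_arimoto(W_bsc))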
