Entropy and channel capacity in the regenerative setup with applications to Markov channels

We obtain new entropy and mutual information formulae for regenerative stochastic processes. Applying them to Markov channels, we generalize the results of Goldsmith and Varaiya (1996). We also obtain tighter bounds on capacity and improved algorithms compared with those of Goldsmith and Varaiya.
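As a rough orientation only (a sketch of the renewal-reward flavour of such results, not a reproduction of the paper's exact formulae or conditions), for a regenerative process $(X_n)$ with i.i.d. cycles of length $T$, the standard heuristic suggests an entropy rate of the form

\[
  \bar{H} \;=\; \lim_{n\to\infty} \frac{1}{n}\, H(X_1,\dots,X_n)
  \;=\; \frac{\mathbb{E}\bigl[H(X_1,\dots,X_T,\,T)\bigr]}{\mathbb{E}[T]},
\]

i.e. the expected entropy accumulated over one regeneration cycle divided by the expected cycle length, with an analogous ratio for the mutual information between channel inputs and outputs over a cycle. The precise conditioning and regularity conditions under which such ratios hold are what the paper works out.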

[1] A. Gut, Stopped Random Walks: Limit Theorems and Applications, 1987.

[2] Søren Asmussen et al., Applied Probability and Queues, 1989.

[3] Thomas M. Cover et al., Elements of Information Theory, 2005.

[4] P. Varaiya et al., Capacity, mutual information, and coding for finite-state Markov channels, 1994, Proceedings of the 1994 IEEE International Symposium on Information Theory.

[5] Prakash Narayan et al., Reliable Communication Under Channel Uncertainty, 1998, IEEE Trans. Inf. Theory.

[6] Shlomo Shamai et al., Fading Channels: Information-Theoretic and Communication Aspects, 1998, IEEE Trans. Inf. Theory.