A Note on Mutual Information in a White Gaussian Channel with Linear Feedback
Let the message m = {m(t)} be a Gaussian process. We consider the transmission of m over a white Gaussian channel with linear feedback. The channel output in question is given by Y(t) = ∫₀ᵗ (m(s) − f(s)) ds + W(t), where f(s) = ∫₀ˢ f(s, u) dY(u) is a causal linear functional of Y and the noise W(t) is a Brownian motion. We shall prove the following: even when such a linear feedback is employed, the mutual information I(m, Y) between m and Y never increases.
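The claim in the abstract can be restated in display form. The following is a hedged sketch under standard notation for additive white Gaussian noise channels; the superscripts Y^f and Y^0, distinguishing the outputs with and without feedback, are our own labels and do not appear in the paper.

```latex
% Channel output with a causal linear feedback functional f:
\[
Y(t) = \int_0^t \bigl(m(s) - f(s)\bigr)\,ds + W(t),
\qquad
f(s) = \int_0^s f(s,u)\,dY(u).
\]
% Claim (restated): for every such causal linear feedback f,
\[
I\bigl(m, Y^{f}\bigr) \;\le\; I\bigl(m, Y^{0}\bigr),
\]
% where Y^{0} denotes the output of the same channel with f \equiv 0
% (i.e., no feedback) and I(\cdot,\cdot) is the mutual information.
```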