An asymptotic expression for the information and capacity of a multidimensional channel with weak input signals
An asymptotic expression is derived for the Shannon mutual information between the input and output signals for a relatively large class of continuous alphabet memoryless channels in the case of weak input signals, when the input space is multidimensional. The authors extend a result of Ibragimov and Khas'minskii (1972) from the one-dimensional case to the N-dimensional case. The asymptotic expression obtained relates the Shannon (1948) mutual information function to the Fisher information matrix. This expression is then used to derive an asymptotic expression for the capacity of continuous alphabet memoryless channels with vector-valued weak input signals. The asymptotic capacity involves the largest eigenvalue of the Fisher information matrix evaluated at the zero input signal.
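For orientation, here is a hedged sketch of the kind of weak-signal expansion the abstract describes; the notation ε, X, J(0), and λ_max, as well as the regularity conditions, are assumptions made for illustration and are not taken from the paper itself. If the channel has transition density p(y|x), J(0) denotes the Fisher information matrix of the family {p(·|x)} at x = 0, and the input is scaled as εX with E‖X‖² ≤ 1, the standard second-order expansion reads

\[
I(\varepsilon X; Y) \;=\; \frac{\varepsilon^{2}}{2}\,\operatorname{tr}\!\bigl(J(0)\,\mathbb{E}[XX^{\mathsf T}]\bigr) \;+\; o(\varepsilon^{2}), \qquad \varepsilon \to 0 .
\]

Maximizing the trace over admissible input covariances concentrates the input along the leading eigenvector of J(0), which gives a capacity expression of the form

\[
C(\varepsilon) \;=\; \frac{\varepsilon^{2}}{2}\,\lambda_{\max}\bigl(J(0)\bigr) \;+\; o(\varepsilon^{2}).
\]

As a consistency check under these same assumptions, for the additive Gaussian channel Y = εX + N with N ~ N(0, K) and X ~ N(0, Σ), one has J(0) = K⁻¹, and the exact mutual information (1/2) log det(I + ε² K⁻¹Σ) indeed behaves like (ε²/2) tr(K⁻¹Σ) as ε → 0.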
[1] A. J. Viterbi, "Very low rate convolutional codes for maximum theoretical performance of spread-spectrum multiple-access channels," IEEE J. Sel. Areas Commun., 1990.
[2] S. Verdú, "On channel capacity per unit cost," IEEE Trans. Inf. Theory, 1990.
[3] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., 1948.
[4] I. A. Ibragimov and R. Z. Khas'minskii, "Asymptotic behavior of statistical estimators in the smooth case. I. Study of the likelihood ratio," 1973.