Abstract. In light of recent results by Verdú and Han on channel capacity, we examine three problems: the strong converse condition for the channel coding theorem, the capacity of arbitrary channels with feedback, and the type-II error exponent in Neyman-Pearson hypothesis testing. It is first remarked that the strong converse condition holds if and only if the sequence of normalized channel information densities converges in probability to a constant; examples illustrating this condition are provided. A general formula for the capacity of arbitrary channels with output feedback is then obtained. Finally, a general expression is derived for the Neyman-Pearson type-II error exponent based on arbitrary observations, subject to a constant bound on the type-I error probability.
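To make the first claim concrete, here is a sketch of the condition in the notation of Verdú and Han [3] (our rendering of the statement, not a quotation from the paper). Writing the normalized information density of the channel W under input process X as
% Notation follows [3]; the equality below is the definition of the information density.
\[
\frac{1}{n}\, i_{X^n W^n}(X^n; Y^n) \;=\; \frac{1}{n}\, \log \frac{P_{Y^n \mid X^n}(Y^n \mid X^n)}{P_{Y^n}(Y^n)},
\]
the capacity formula of [3] is $C = \sup_{\mathbf{X}} \underline{I}(\mathbf{X};\mathbf{Y})$, where $\underline{I}$ (resp. $\overline{I}$) denotes the liminf (resp. limsup) in probability of the quantity above. The strong converse condition then amounts to
\[
\sup_{\mathbf{X}} \underline{I}(\mathbf{X};\mathbf{Y}) \;=\; \sup_{\mathbf{X}} \overline{I}(\mathbf{X};\mathbf{Y}) \;=\; C,
\]
i.e., under an optimal input the normalized information densities converge in probability to the constant C.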
[1] R. M. Gray, Entropy and Information Theory. New York: Springer-Verlag, 1990.
[2] G. Longo, Source Coding Theory, 1970.
[3] S. Verdú and T. S. Han, "A general formula for channel capacity," IEEE Trans. Inf. Theory, vol. 40, no. 4, pp. 1147-1157, July 1994.
[4] T. S. Han and S. Verdú, "Approximation theory of output statistics," IEEE Trans. Inf. Theory, vol. 39, no. 3, pp. 752-772, May 1993.
[5] R. E. Blahut, Principles and Practice of Information Theory. Reading, MA: Addison-Wesley, 1987.
[6] M. S. Pinsker, Information and Information Stability of Random Variables and Processes, translated by A. Feinstein. San Francisco: Holden-Day, 1964.