A theorem on the entropy of certain binary sequences and applications-I

In this, the first part of a two-part paper, we establish a theorem concerning the entropy of a certain sequence of binary random variables. In the sequel we will apply this result to the solution of three problems in multi-user communication, two of which have been open for some time. Specifically, we show the following. Let X and Y be binary random n-vectors, which are the input and output, respectively, of a binary symmetric channel with "crossover" probability p_0 . Let H\{X\} and H\{Y\} be the entropies of X and Y , respectively. Then \begin{equation} \frac{1}{n} H\{X\} \geq h(\alpha_0), \quad 0 \leq \alpha_0 \leq 1 \;\Longrightarrow\; \frac{1}{n} H\{Y\} \geq h\bigl(\alpha_0 (1 - p_0) + (1 - \alpha_0) p_0\bigr), \end{equation} where h(\lambda) = -\lambda \log \lambda - (1 - \lambda) \log(1 - \lambda), 0 \leq \lambda \leq 1 .
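As a quick numerical sanity check (not from the paper), consider the simplifying assumption that the components of X are i.i.d. Bernoulli(\alpha_0), so that \frac{1}{n} H\{X\} = h(\alpha_0) exactly. Passing X through the channel makes the components of Y i.i.d. Bernoulli(\alpha_0(1 - p_0) + (1 - \alpha_0) p_0), and the theorem's bound is then met with equality. With \alpha_0 = 0.1 and p_0 = 0.05 : \begin{equation} \frac{1}{n} H\{X\} = h(0.1) \approx 0.469 \text{ bits}, \qquad \frac{1}{n} H\{Y\} = h(0.1 \cdot 0.95 + 0.9 \cdot 0.05) = h(0.14) \approx 0.584 \text{ bits}, \end{equation} illustrating that the per-symbol output entropy bound exceeds the input entropy bound, as the channel noise can only increase uncertainty here.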
