Some Remarks on the Capacity of Compound Channels in the Semicontinuous Case

Since we use the standard terminology of coding and information theory, as it can be found in Feinstein (1958) or Wolfowitz (1961), we shall be brief in describing the setup. Consider a situation where a sender can transmit n symbols over a (noisy) channel s. The symbols are to be chosen from an input alphabet which is assumed to be the set $\{1, 2, \ldots, a\}$ for all channels under consideration. The channel s may be any one from a given set S and remains the same for all n letters (this is the meaning of the term compound channel, the name having been introduced in Wolfowitz, 1961). The choice of the transmission channel cannot be influenced by sender or receiver, but in some circumstances (cf. Section III) it may be known to one or both of them. The symbols received by the receiver belong to an output alphabet which may depend on s but which (by definition of the term semicontinuous) may be infinite. In order to make life easier we assume the output alphabet to be the set of integers for all $s \in S$. Theorems 1, 4, and the first part of Theorem 3, however, carry over without difficulty to the more general setup described by Feinstein (1958) and Wolfowitz (1961), where the output alphabet belongs to any space with a given Borel field. If a sequence $u = (i_1, \ldots, i_n)$ of n letters is transmitted, the received sequence of n letters, say $v(u) = (Y_1(u), \ldots, Y_n(u))$, is a random variable. We assume the channels in S to be stationary, memoryless, and without anticipation, i.e., there exist channel probability functions
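(In the standard formulation of Feinstein, 1958, and Wolfowitz, 1961, these three assumptions amount to the factorization displayed below; the single-letter notation $w(j \mid i, s)$ is introduced here only as an illustrative sketch and need not coincide with the notation adopted in what follows.)

\[
\Pr\{\, v(u) = (j_1, \ldots, j_n) \mid u = (i_1, \ldots, i_n),\; s \,\}
  \;=\; \prod_{k=1}^{n} w(j_k \mid i_k, s),
\]

where $w(j \mid i, s)$ denotes the probability that output letter j is received when input letter i is sent over channel s. Stationarity, lack of memory, and lack of anticipation are reflected in the fact that the k-th factor depends only on the k-th input letter, and neither on the position k nor on the other input or output letters.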