In the theory of identification via noisy channels, randomization in the encoding has a dramatic effect on the optimal code size: it grows double-exponentially in the blocklength, whereas in the theory of transmission it shows the familiar exponential growth. Instead of the discrete memoryless channel (DMC), we now consider more robust channels such as the familiar compound channel (CC) and the arbitrarily varying channel (AVC). They can be viewed as models for jamming situations. We make the pessimistic assumption that the jammer knows the input sequence before he acts. This forces the communicators to use the maximal error concept and also makes randomization in the encoding superfluous. Now, for a DMC $W$, by a simple observation made by Ahlswede and Dueck (1989), in the absence of randomization the identification capacity, say $C_{NRI}(W)$, equals the logarithm of the number of different row vectors of $W$. We generalize this to compound channels. A formidable problem arises if the DMC $W$ is replaced by the AVC $\mathcal{W}$. In fact, if $\mathcal{W}$ contains only 0-1-matrices, we are led, exactly as for transmission, to the equivalent zero-error capacity of Shannon. But for general $\mathcal{W}$ the identification capacity $C_{NRI}(\mathcal{W})$ is quite different from the transmission capacity $C(\mathcal{W})$. An observation is that the separation codes of Ahlswede (1989) are also relevant here. We present a lower bound on $C_{NRI}(\mathcal{W})$. It implies, for instance, for $\mathcal{W} = \left\{ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \begin{pmatrix} 1 & 0 \\ \delta & 1-\delta \end{pmatrix} \right\}$ with $\delta \in (0, \tfrac{1}{2})$, that $C_{NRI}(\mathcal{W}) = 1$, which is obviously tight. It exceeds $C(\mathcal{W})$, which is known to exceed $1 - h(\delta)$, where $h$ is the binary entropy function. We observe that a separation code with worst-case average list size $\tilde{L}$ (which we call an NRA code) can be partitioned into $\tilde{L}\,2^{n\epsilon}$ transmission codes. This gives a non-single-letter characterization of the capacity of the AVC with maximal probability of error in terms of the capacity of codes with list decoding. We also prove that randomization in the decoding does not increase $C_I(W)$ and $C_{NRI}(W)$. Finally, we draw attention to related work on source coding.
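The row-counting formula for the single-DMC case lends itself to a small numerical illustration. The sketch below is not the paper's construction for compound or arbitrarily varying channels; it only computes $C_{NRI}(W)$ for a DMC as the base-2 logarithm of the number of distinct rows of $W$, together with the quantity $1 - h(\delta)$ mentioned for the AVC example. The function names (`c_nri_dmc`, `binary_entropy`) and the sample matrix are illustrative assumptions, not from the paper.

```python
import math

def c_nri_dmc(W):
    """Non-randomized identification capacity of a DMC (in bits), per the
    Ahlswede-Dueck (1989) observation cited in the abstract: the base-2
    logarithm of the number of distinct row vectors of W."""
    distinct_rows = {tuple(row) for row in W}
    return math.log2(len(distinct_rows))

def binary_entropy(p):
    """Binary entropy function h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Hypothetical DMC with two distinct rows: C_NRI(W) = log2(2) = 1 bit.
W = [(1.0, 0.0),
     (0.1, 0.9)]
print(c_nri_dmc(W))              # 1.0

# The quantity 1 - h(delta) mentioned for the AVC example, with delta = 0.1.
print(1 - binary_entropy(0.1))   # ~0.531
```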
[1] R. Ahlswede, "A note on the existence of the weak capacity for channels with arbitrarily varying channel probability functions and its relation to Shannon's zero error capacity," 1970.
[2] T. Linder et al., "The multiple description rate region for high resolution source coding," Proceedings DCC '98 Data Compression Conference, 1998.
[3] R. Ahlswede and G. Dueck, "Identification via channels," IEEE Trans. Inf. Theory, 1989.
[4] R. Ahlswede, "Elimination of correlation in random codes for arbitrarily varying channels," 1978.
[5] I. Csiszár et al., "The capacity of the arbitrarily varying channel revisited: positivity, constraints," IEEE Trans. Inf. Theory, 1988.
[6] R. Ahlswede et al., "A general theory of information transfer," Proceedings IEEE International Symposium on Information Theory, 1993.
[7] C. E. Shannon, "The zero error capacity of a noisy channel," IRE Trans. Inf. Theory, 1956.
[8] I. Csiszár et al., "On the capacity of the arbitrarily varying channel for maximum probability of error," 1981.
[9] R. Ahlswede et al., "A method of coding and its application to arbitrarily varying channels," 1980.
[10] A. El Gamal et al., "Achievable rates for multiple descriptions," IEEE Trans. Inf. Theory, 1982.
[11] T. Linder et al., "On the asymptotic tightness of the Shannon lower bound," IEEE Trans. Inf. Theory, 1994.