Oblivious channels

Let C = {X_1, ..., X_N} ⊆ {0,1}^n be an [n, N] binary error-correcting code (not necessarily linear). Let e ∈ {0,1}^n be an error vector. A codeword X ∈ C is said to be disturbed by the error e if the closest codeword to X ⊕ e is no longer X. Let A_e be the subset of codewords in C that are disturbed by e. In this work we study the size of A_e in random codes C (i.e., codes in which each codeword X_i is chosen uniformly and independently at random from {0,1}^n). Using recent results of Vu [Random Structures and Algorithms 20(3)] on the concentration of non-Lipschitz functions, we show that |A_e| is strongly concentrated for a wide range of values of N and ‖e‖. We apply this result in the study of communication channels we refer to as oblivious. Roughly speaking, a channel W(y|x) is said to be oblivious if the error distribution imposed by the channel is independent of the transmitted codeword x. For example, the well-studied binary symmetric channel is an oblivious channel. In this work, we define oblivious and partially oblivious channels and present lower bounds on their capacity. The oblivious channels we define have connections to arbitrarily varying channels with state constraints.
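To make the definition of A_e concrete, the following Python sketch (illustrative only, not code from the paper; the brute-force minimum-distance decoder and the convention that distance ties count as disturbances are assumptions) draws a random [n, N] code, applies a fixed error vector e to each codeword, and counts |A_e|, the number of codewords disturbed by e.

import random

def hamming(a, b):
    # Hamming distance between two equal-length bit tuples.
    return sum(x != y for x, y in zip(a, b))

def random_code(n, N):
    # N codewords drawn uniformly and independently from {0,1}^n.
    return [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(N)]

def disturbed_set_size(code, e):
    # |A_e|: codewords X for which the closest codeword to X xor e is no longer X.
    count = 0
    for X in code:
        y = tuple(x ^ b for x, b in zip(X, e))  # received word X xor e
        d_own = hamming(y, X)
        # X is disturbed if some other codeword is at least as close to y as X is.
        if any(hamming(y, Z) <= d_own for Z in code if Z is not X):
            count += 1
    return count

if __name__ == "__main__":
    n, N, weight = 20, 50, 3
    e = tuple(1 if i < weight else 0 for i in range(n))  # error vector of weight 3
    trials = [disturbed_set_size(random_code(n, N), e) for _ in range(10)]
    print("observed |A_e| over 10 random codes:", trials)

Repeating the experiment for several values of N and of the weight of e gives an informal picture of the concentration of |A_e| that the paper establishes formally.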

[1] Imre Csiszár et al., Capacity and decoding rules for classes of arbitrarily varying channels, IEEE Trans. Inf. Theory, 1989.

[2] Michael Langberg et al., Private codes or succinct random codes that are (almost) perfect, 45th Annual IEEE Symposium on Foundations of Computer Science (FOCS), 2004.

[3] Prakash Narayan et al., Reliable Communication Under Channel Uncertainty, IEEE Trans. Inf. Theory, 1998.

[4] C. E. Shannon, A Mathematical Theory of Communication, Bell Syst. Tech. J., 1948.

[5] Joel H. Spencer et al., Probabilistic methods, Graphs Comb., 1985.

[6] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. I, Inf. Control, 1967.

[7] Richard J. Lipton et al., A New Approach To Information Theory, STACS, 1994.

[8] Imre Csiszár et al., The capacity of the arbitrarily varying channel revisited: Positivity, constraints, IEEE Trans. Inf. Theory, 1988.

[9] F. J. MacWilliams and N. J. A. Sloane, The Theory of Error-Correcting Codes, North-Holland, 1977.

[10] Peter Elias, Error-correcting codes for list decoding, IEEE Trans. Inf. Theory, 1991.

[11] Robert J. McEliece et al., New upper bounds on the rate of a code via the Delsarte-MacWilliams inequalities, IEEE Trans. Inf. Theory, 1977.

[12] E. Gilbert, A comparison of signalling alphabets, Bell Syst. Tech. J., 1952.

[13] O. Antoine et al., Theory of Error-Correcting Codes, 2022.

[14] W. Hoeffding, Probability Inequalities for Sums of Bounded Random Variables, J. Amer. Statist. Assoc., 1963.

[15] D. Blackwell et al., The Capacities of Certain Channel Classes Under Random Coding, Ann. Math. Statist., 1960.

[16] Van H. Vu, Concentration of non-Lipschitz functions and applications, Random Struct. Algorithms, 2002.