Oblivious Communication Channels and Their Capacity

Let C = {x_1, ..., x_N} ⊆ {0,1}^n be an [n,N] binary error-correcting code (not necessarily linear). Let e ∈ {0,1}^n be an error vector. A codeword x ∈ C is said to be disturbed by the error e if the closest codeword to x ⊕ e is no longer x. Let A_e be the subset of codewords in C that are disturbed by e. In this work, we study the size of A_e in random codes C (i.e., codes in which each codeword is chosen uniformly and independently at random from {0,1}^n). Using recent results of Vu [Random Structures and Algorithms, vol. 20, no. 3, pp. 262-316, 2002] on the concentration of non-Lipschitz functions, we show that |A_e| is strongly concentrated for a wide range of values of N and of the weight ‖e‖. We apply this result to the study of communication channels we refer to as oblivious. Roughly speaking, a channel W(y|x) is said to be oblivious if the error distribution imposed by the channel is independent of the transmitted codeword x. A family of channels Ψ is said to be oblivious if every member W of the family is oblivious. In this work, we define oblivious and partially oblivious families of (not necessarily memoryless) channels and analyze their capacity. When considering the capacity of a family of channels Ψ, one must address the design of error-correcting codes that allow communication under uncertainty about which channel W ∈ Ψ is actually in use. The oblivious channels we define have connections to arbitrarily varying channels with state constraints.
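The definition of the disturbed set A_e can be made concrete with a small brute-force sketch (not from the paper; names and the tie-breaking convention below are our own assumptions): draw a random [n,N] code, fix an error vector e, and count the codewords x for which some other codeword is at least as close to x ⊕ e as x itself.

```python
# Illustrative sketch: estimate |A_e|, the number of codewords "disturbed" by a
# fixed error vector e, in a random binary code. Ties (another codeword equally
# close to x XOR e) are counted as disturbed -- an assumption, since the
# abstract does not specify tie-breaking.
import random

def hamming(u, v):
    """Hamming distance between two equal-length binary tuples."""
    return sum(a != b for a, b in zip(u, v))

def disturbed_set_size(code, e):
    """Count codewords x whose closest codeword to x XOR e is no longer x."""
    count = 0
    for x in code:
        y = tuple(a ^ b for a, b in zip(x, e))  # received word x XOR e
        d_x = hamming(y, x)                     # distance back to the sent word
        if any(hamming(y, c) <= d_x for c in code if c is not x):
            count += 1
    return count

random.seed(0)
n, N = 12, 16                                   # block length n, code size N
code = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(N)]
e = tuple(1 if i < 3 else 0 for i in range(n))  # error vector of weight 3
print(disturbed_set_size(code, e))              # |A_e| for this random code
```

Repeating the draw of C many times would show the concentration phenomenon the abstract describes: |A_e| clusters tightly around its mean once N and ‖e‖ are in the right range.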

[1] I. Csiszár and P. Narayan, "Capacity and decoding rules for classes of arbitrarily varying channels," IEEE Trans. Inf. Theory, 1989.

[2] J. Wolfowitz, "The coding of messages subject to chance errors," 1957.

[3] R. J. Lipton, "A new approach to information theory," STACS, 1994.

[4] P. Elias, "Error-correcting codes for list decoding," IEEE Trans. Inf. Theory, 1991.

[5] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., 1948.

[6] M. Langberg, "Private codes or succinct random codes that are (almost) perfect," 45th Annual IEEE Symposium on Foundations of Computer Science (FOCS), 2004.

[7] A. Lapidoth and P. Narayan, "Reliable communication under channel uncertainty," IEEE Trans. Inf. Theory, 1998.

[8] N. Alon and J. H. Spencer, The Probabilistic Method, Wiley.

[9] D. Blackwell, L. Breiman, and A. J. Thomasian, "The capacities of certain channel classes under random coding," 1960.

[10] I. Csiszár and P. Narayan, "The capacity of the arbitrarily varying channel revisited: positivity, constraints," IEEE Trans. Inf. Theory, 1988.

[11] J. H. Spencer, "Probabilistic methods," Graphs Combin., 1985.

[12] W. Hoeffding, "Probability inequalities for sums of bounded random variables," 1963.

[13] R. J. McEliece, E. R. Rodemich, H. Rumsey, and L. R. Welch, "New upper bounds on the rate of a code via the Delsarte-MacWilliams inequalities," IEEE Trans. Inf. Theory, 1977.

[14] E. N. Gilbert, "A comparison of signalling alphabets," 1952.

[15] F. J. MacWilliams and N. J. A. Sloane, The Theory of Error-Correcting Codes, 1977.

[16] V. H. Vu, "Concentration of non-Lipschitz functions and applications," Random Struct. Algorithms, vol. 20, no. 3, pp. 262-316, 2002.

[17] N. Alon and J. H. Spencer, The Probabilistic Method, 2nd ed., Wiley, 2000.

[18] A. Sabharwal et al., "Antenna packing in low-power systems: communication limits and array design," IEEE Trans. Inf. Theory, 2008.