On a Markov Lemma and Typical Sequences for Polish Alphabets

In this paper, we consider a new definition of typicality based on the weak* topology that is applicable to Polish alphabets (which include ℝⁿ). This notion generalizes strong typicality in the sense that it reduces to strong typicality in the finite-alphabet case, yet it also applies to mixed and continuous distributions. Furthermore, it is strong enough to prove a Markov lemma, and can thus be used to directly prove a more general class of results than entropy (or weak) typicality. We provide two example applications of this technique. First, using the Markov lemma, we directly prove a coding result for Gel'fand-Pinsker channels with an average input constraint for a large class of alphabets and channels, without first proving a finite-alphabet result and then resorting to delicate quantization arguments. This class of alphabets includes, for example, real and complex inputs subject to a peak amplitude restriction. While this class does not directly accommodate Gaussian distributions with average power constraints, it is straightforward to recover this case by considering a sequence of truncated Gaussian distributions. As a second example, we consider a problem of coordinated actions (i.e., empirical distributions) for a two-node network, where we derive necessary and sufficient conditions for achieving a given desired coordination.
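The following Python sketch (not part of the paper; it assumes SciPy's truncnorm and uses arbitrary peak limits chosen only for illustration) shows the intuition behind the truncated-Gaussian approximation mentioned above: as the peak-amplitude limit A grows, the average power of a peak-limited (truncated) Gaussian input approaches the target power P of the unconstrained Gaussian.

```python
# Illustrative sketch only: approximating a Gaussian input with average power P
# by a sequence of peak-limited (truncated) Gaussian inputs with limits |X| <= A.
import numpy as np
from scipy.stats import truncnorm

P = 1.0                 # target average power (variance of the unconstrained Gaussian)
sigma = np.sqrt(P)

for A in [1.0, 2.0, 4.0, 8.0, 16.0]:     # increasing peak-amplitude limits (arbitrary)
    a, b = -A / sigma, A / sigma          # standardized truncation points for truncnorm
    X = truncnorm(a, b, loc=0.0, scale=sigma)
    # Mean is 0 by symmetry, so the variance equals E[X^2]; it tends to P as A grows,
    # i.e., the peak-limited inputs recover the average-power-constrained Gaussian.
    print(f"A = {A:5.1f}   E[X^2] = {X.var():.6f}")
```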
