Capacity and Error Exponents of Stationary Point Processes under Random Additive Displacements

Consider a real-valued discrete-time stationary and ergodic stochastic process, called the noise process. For each dimension n, we choose a stationary point process in ℝⁿ and a translation-invariant tessellation of ℝⁿ. Each point is randomly displaced, with the displacement vector being a length-n section of the noise process, independent from point to point. The aim is to find a point process and a tessellation that minimize the probability of decoding error, defined as the probability that the displaced version of the typical point does not belong to the cell of that point. We consider the Shannon regime, in which the dimension n tends to ∞ while the logarithm of the intensity of the point processes, normalized by dimension, tends to a constant. We first show that this problem exhibits a sharp threshold: if the sum of the asymptotic normalized logarithmic intensity and of the differential entropy rate of the noise process is positive, then the probability of error tends to 1 with n for all point processes and all tessellations; if it is negative, then there exist point processes and tessellations for which this probability tends to 0. The error exponent function, which quantifies how quickly the probability of error goes to 0 in n, is then derived using large deviations theory. If the entropy spectrum of the noise satisfies a large deviations principle, then, below the threshold, the error probability goes exponentially fast to 0 with an exponent that is given in closed form in terms of the rate function of the noise entropy spectrum. This is obtained for two classes of point processes: the Poisson process and a Matérn hard-core point process. New lower bounds on error exponents are derived from this for Shannon's additive noise channel in the high signal-to-noise ratio limit; these bounds hold for all stationary and ergodic noises with the above properties and match the best known bounds in the white Gaussian noise case.
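As an illustration of the Poisson case with nearest-neighbour (Voronoi) decoding, Slivnyak's theorem gives the conditional error probability given a displacement d in closed form: the points other than the typical one form a Poisson process of the same intensity λ, so an error occurs iff one of them falls in the ball of radius ‖d‖ around the displaced point, i.e. P(error | d) = 1 − exp(−λ·vₙ·‖d‖ⁿ), with vₙ the volume of the unit ball in ℝⁿ. The following is a minimal Monte Carlo sketch of the threshold behaviour (the function name and the choice of i.i.d. Gaussian noise, for which the differential entropy rate is ½·log(2πeσ²), are ours, not from the paper):

```python
import math
import numpy as np

def error_probability(n, R, sigma=1.0, trials=10000, seed=0):
    """Monte Carlo estimate of the decoding-error probability for a
    Poisson point process of intensity exp(n*R) in R^n, with i.i.d.
    N(0, sigma^2) displacement coordinates and nearest-neighbour decoding.

    By Slivnyak's theorem, conditioned on a displacement d,
        P(error | d) = 1 - exp(-lam * v_n * ||d||^n),
    where v_n is the volume of the unit ball in R^n; we average this
    conditional probability over random Gaussian displacements.
    """
    rng = np.random.default_rng(seed)
    lam = math.exp(n * R)                              # intensity e^{nR}
    v_n = math.pi ** (n / 2) / math.gamma(n / 2 + 1)   # unit-ball volume
    d = rng.normal(0.0, sigma, size=(trials, n))       # displacements
    r = np.linalg.norm(d, axis=1)                      # ||d|| per trial
    return float(np.mean(1.0 - np.exp(-lam * v_n * r ** n)))
```

For σ = 1 the threshold of the abstract sits at R = −½·log(2πe) ≈ −1.42: taking, say, n = 50, the estimate is essentially 0 for R = −2.0 (below threshold) and essentially 1 for R = −1.0 (above threshold), consistent with the sharp-threshold statement.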
