On the Distribution of the Information Density of Gaussian Random Vectors: Explicit Formulas and Tight Approximations

Based on canonical correlation analysis, we derive series representations of the probability density function (PDF) and the cumulative distribution function (CDF) of the information density of arbitrary Gaussian random vectors, as well as a general formula for calculating the central moments. Using these general results, we give closed-form expressions for the PDF and CDF and explicit formulas for the central moments in important special cases. Furthermore, we derive recurrence formulas and tight approximations of the general series representations, which allow very efficient numerical calculation with arbitrarily high accuracy, as demonstrated with a Python implementation publicly available on GitLab. Finally, we discuss the (in)validity of Gaussian approximations of the information density.

Index Terms: information density, information spectrum, probability density function, cumulative distribution function, central moments, Gaussian random vector, canonical correlation analysis

Funded in part by the German Research Foundation (DFG, Deutsche Forschungsgemeinschaft) as part of Germany's Excellence Strategy – EXC 2050/1 – Project ID 390696704 – Cluster of Excellence "Centre for Tactile Internet with Human-in-the-Loop" (CeTI) of Technische Universität Dresden.

Huffmann, Mittelbach. arXiv:2105.03925v3 [cs.IT], 4 Nov 2021.
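To make the object of study concrete, the following is a minimal Monte Carlo sketch of the information density in the simplest special case: a zero-mean bivariate Gaussian pair with unit variances and correlation rho (a single canonical correlation). This is an illustrative sketch, not the authors' GitLab implementation; the function and variable names are my own. For this model the information density has the closed form i(x;y) = -½ln(1-ρ²) + ρ(2xy - ρx² - ρy²)/(2(1-ρ²)) in nats, with mean equal to the mutual information -½ln(1-ρ²) and variance ρ².

```python
import numpy as np

def info_density(x, y, rho):
    """Information density i(x;y) in nats for a zero-mean bivariate
    Gaussian pair with unit variances and correlation rho, i.e.
    log of the joint density over the product of the marginals."""
    return (-0.5 * np.log(1.0 - rho**2)
            + rho * (2.0 * x * y - rho * (x**2 + y**2))
              / (2.0 * (1.0 - rho**2)))

rng = np.random.default_rng(0)
rho = 0.8
n = 200_000

# Draw correlated standard-normal pairs (Cholesky-style construction).
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

samples = info_density(x, y, rho)

# For this model the exact mean is the mutual information -0.5*ln(1-rho^2)
# and the exact variance is rho^2; the empirical values should be close.
emp_mean, emp_var = samples.mean(), samples.var()
true_mean, true_var = -0.5 * np.log(1.0 - rho**2), rho**2
```

The nonzero variance is exactly why a single-number characterization (the mutual information) is insufficient in the finite-blocklength regime, and why the full distribution of the information density matters.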
