Gaussian distributions on Riemannian symmetric spaces, random matrices, and planar Feynman diagrams.

Gaussian distributions can be generalized from Euclidean space to a wide class of Riemannian manifolds. Gaussian distributions on manifolds are, however, harder to use in applications, because their normalisation factors, which we refer to as partition functions, are in general complicated, intractable integrals that depend in a highly non-linear way on the mean of the distribution. On Riemannian symmetric spaces, by contrast, the partition functions are independent of the mean and reduce to integrals over finite-dimensional vector spaces. These integrals are still hard to compute numerically when the dimension (more precisely, the rank $N$) of the underlying symmetric space becomes large. On the space of positive definite Hermitian matrices, they can be computed exactly using methods from random matrix theory and the so-called Stieltjes-Wigert polynomials. In other cases of interest to applications, such as the space of symmetric positive definite (SPD) matrices or the Siegel domain (related to block-Toeplitz covariance matrices), these methods do not seem to work as well. It remains possible, however, to compute the leading-order terms in a large $N$ limit, which provide increasingly accurate approximations as $N$ grows. This limit is inspired by realizing a given partition function as the partition function of a zero-dimensional quantum field theory, or even of Chern-Simons theory. From this point of view, the large $N$ limit arises naturally, and saddle-point methods, Feynman diagrams, and certain universality properties relating different spaces become available.
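
To make the reduction concrete, the display below sketches the form these objects take on the space of $N \times N$ symmetric positive definite matrices, following the conventions of Said, Bombrun, Berthoumieu, and Manton; the symbols $d$, $Z(\sigma)$, $C_N$ and the variables $r_1, \dots, r_N$ are notation introduced here for illustration only. The Riemannian Gaussian density with mean $\bar{Y}$ and dispersion $\sigma$ is
\[
p(Y \mid \bar{Y}, \sigma) \;=\; \frac{1}{Z(\sigma)} \exp\!\left( -\frac{d^{2}(Y, \bar{Y})}{2\sigma^{2}} \right),
\qquad
d(Y, \bar{Y}) \;=\; \big\lVert \log\big( \bar{Y}^{-1/2}\, Y\, \bar{Y}^{-1/2} \big) \big\rVert_{F},
\]
where $d$ is the affine-invariant Riemannian distance. The partition function is independent of $\bar{Y}$ and reduces, up to a constant $C_N$ depending only on $N$, to an integral over $\mathbb{R}^{N}$:
\[
Z(\sigma) \;=\; C_N \int_{\mathbb{R}^{N}} e^{-\lvert r \rvert^{2} / 2\sigma^{2}} \prod_{i<j} \sinh\!\left( \frac{\lvert r_i - r_j \rvert}{2} \right) dr_1 \cdots dr_N .
\]
It is integrals of this type (with $\sinh^{2}$ factors in the Hermitian case, where the Stieltjes-Wigert connection applies) whose large $N$ behaviour is studied here.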
