Approximate Log-Determinant Divergences Between Covariance Operators and Applications

Covariance matrices and covariance operators play increasingly important roles in numerous applications in machine learning, computer vision, and image and signal processing. An active research direction on covariance matrices and operators is the exploitation of their intrinsic non-Euclidean geometric structures for optimal practical performance. In this work, we consider the Log-Determinant divergences, a parametrized family that encompasses many divergences and distances between covariance matrices and operators, including the affine-invariant Riemannian distance and the symmetric Stein divergence. In particular, we present finite-dimensional approximations of the infinite-dimensional Log-Determinant divergences between covariance operators, which consistently estimate the exact versions while being substantially more efficient to compute. Computationally, we focus on covariance operators in reproducing kernel Hilbert spaces (RKHS). For the Hellinger distance, defined via the symmetric Stein divergence, we obtain a two-layer kernel machine defined using both the mean vector and the covariance operator. The theoretical formulation is accompanied by numerical experiments in computer vision.
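
To make the quantities above concrete, the Python sketch below illustrates, in finite dimensions, the symmetric Stein divergence and the affine-invariant Riemannian distance between SPD matrices, together with a random-Fourier-feature surrogate for a Gaussian-kernel covariance operator, in the spirit of the finite-dimensional approximations discussed here. This is a minimal illustration under stated assumptions, not the paper's implementation; the helper names (stein_divergence, affine_invariant_distance, rff_covariance) and the parameters D, gamma, and eps are hypothetical choices for the example.

```python
import numpy as np
from scipy.linalg import eigh

def stein_divergence(A, B):
    """Symmetric Stein divergence between SPD matrices:
    D_S(A, B) = log det((A + B)/2) - (1/2) log det(A B)."""
    _, ld_mid = np.linalg.slogdet((A + B) / 2.0)
    _, ld_a = np.linalg.slogdet(A)
    _, ld_b = np.linalg.slogdet(B)
    return ld_mid - 0.5 * (ld_a + ld_b)

def affine_invariant_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices:
    d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F, computed from the
    generalized eigenvalues of the pencil (B, A)."""
    w = eigh(B, A, eigvals_only=True)
    return np.sqrt(np.sum(np.log(w) ** 2))

def rff_covariance(X, D=200, gamma=1.0, seed=0):
    """D x D covariance matrix of random Fourier features of the sample
    X (n x d): a finite-dimensional surrogate for the Gaussian-kernel
    covariance operator. D, gamma, and seed are illustrative choices."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    Phi = np.sqrt(2.0 / D) * np.cos(X @ W + b)  # n x D feature matrix
    Phi -= Phi.mean(axis=0)                     # center the features
    return (Phi.T @ Phi) / n

# Example: approximate divergences between two covariance operators,
# each estimated from 100 samples in 5 dimensions.
X = np.random.default_rng(1).normal(size=(100, 5))
Y = np.random.default_rng(2).normal(size=(100, 5))
eps = 1e-3  # regularizer (an assumption): empirical covariances are rank-deficient
C1 = rff_covariance(X) + eps * np.eye(200)
C2 = rff_covariance(Y) + eps * np.eye(200)
print(stein_divergence(C1, C2), affine_invariant_distance(C1, C2))
```

The eps * I term is essential here: with n = 100 samples and D = 200 features, the empirical covariances are singular and the log-determinants would diverge. In the infinite-dimensional setting, regularized formulations of these divergences play an analogous role.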
