A note on Onicescu's informational energy and correlation coefficient in exponential families

The informational energy of Onicescu is a positive quantity that, like Shannon’s entropy, quantifies the uncertainty of a random variable. However, contrary to Shannon’s entropy, which is concave and increases with randomness, the informational energy is strictly convex and increases when randomness decreases. We report a closed-form formula for Onicescu’s informational energy and its associated correlation coefficient when the probability distributions belong to an exponential family. We show how to instantiate the generic formula for several common exponential families. Finally, we discuss the characterization of valid thermodynamic process trajectories on a statistical manifold by enforcing that the entropy and the informational energy must vary in opposite directions.
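To make the quantity concrete: for a density p, Onicescu’s informational energy is the integral of p squared. A minimal sketch, for the univariate Gaussian N(mu, sigma^2), where this integral admits the well-known closed form 1/(2*sigma*sqrt(pi)); the function names are illustrative, not from the paper, and the numerical integral is only a sanity check of the closed form:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def informational_energy_closed_form(sigma):
    """Onicescu informational energy of N(mu, sigma^2):
    integral of p(x)^2 dx = 1 / (2 * sigma * sqrt(pi)) (independent of mu)."""
    return 1.0 / (2.0 * sigma * math.sqrt(math.pi))

def informational_energy_numeric(mu, sigma, n=200_000, half_width=12.0):
    """Midpoint-rule approximation of the integral of p(x)^2 over
    [mu - half_width*sigma, mu + half_width*sigma], where the mass concentrates."""
    a = mu - half_width * sigma
    h = 2.0 * half_width * sigma / n
    return sum(gaussian_pdf(a + (i + 0.5) * h, mu, sigma) ** 2 for i in range(n)) * h

mu, sigma = 1.3, 0.7
cf = informational_energy_closed_form(sigma)
num = informational_energy_numeric(mu, sigma)
print(cf, num)  # the two values agree to high precision
```

Note that the informational energy grows as sigma shrinks, illustrating the abstract’s point that it increases when randomness decreases, the opposite of Shannon’s differential entropy for the Gaussian.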
