Information Geometry Under Monotone Embedding. Part I: Divergence Functions

The standard model of information geometry, expressed through the Fisher-Rao metric and the Amari-Chentsov tensor, reflects an embedding of probability densities by the \(\log\)-transform. This standard embedding has been generalized by one-parameter families of embedding functions, such as the \(\alpha\)-embedding, the \(q\)-embedding, and the \(\kappa\)-embedding. Further generalizations using arbitrary monotone functions (or positive functions as their derivatives) include the deformed-log embedding (Naudts), the U-embedding (Eguchi), and the rho-tau dual embedding (Zhang). Here we demonstrate that the divergence function under the rho-tau dual embedding degenerates, upon taking \(\rho = \mathrm{id}\), to that under either the deformed-log embedding or the U-embedding; hence the latter two yield an identical divergence function. While the rho-tau embedding gives rise to the most general form of cross-entropy, with two free functions, its entropy reduces to the deformed entropy of Naudts, with only one free function. Fixing the gauge freedom in the rho-tau embedding by normalizing the dual-entropy function causes the rho-tau cross-entropy to degenerate to the U cross-entropy of Eguchi, which has the simpler property, not shared by the general rho-tau cross-entropy, of reducing to the deformed entropy when the two probability density functions are set equal. In Part I we investigate monotone embedding in divergence functions, entropy, and cross-entropy; in the sequel (Part II), we investigate it in the induced geometries and probability families.
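To make the \(\rho = \mathrm{id}\) degeneration concrete, the following is a minimal worked sketch, assuming the standard forms of the rho-tau divergence (Zhang) and the U-divergence (Eguchi); the convex function \(f\) with \(f' = \tau \circ \rho^{-1}\), its convex conjugate \(f^{*}\), and \(\xi = (U')^{-1}\) are notation introduced here only for illustration.
\[
  D_{\rho,\tau}(p,q) \;=\; \int \Bigl[\, f\bigl(\rho(p)\bigr) + f^{*}\bigl(\tau(q)\bigr) - \rho(p)\,\tau(q) \,\Bigr]\, d\mu ,
  \qquad f' = \tau \circ \rho^{-1} .
\]
Setting \(\rho = \mathrm{id}\) gives \(f' = \tau\), so the Fenchel-Young equality \(f^{*}(\tau(p)) = p\,\tau(p) - f(p)\) holds pointwise, and
\[
  D_{\mathrm{id},\tau}(p,q)
  \;=\; \int \Bigl[\, f^{*}\bigl(\tau(q)\bigr) - f^{*}\bigl(\tau(p)\bigr) - p\,\bigl(\tau(q) - \tau(p)\bigr) \,\Bigr]\, d\mu ,
\]
which is the U-divergence with \(U = f^{*}\) and \(\xi = \tau\). For example, taking \(\tau = \log\) (so that \(f^{*} = \exp\)) recovers the extended Kullback-Leibler divergence \(\int \bigl[\, p \log(p/q) - p + q \,\bigr]\, d\mu\).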