An extended Čencov characterization of the information metric

Čencov has shown that Riemannian metrics derived from the Fisher information matrix are the only metrics that preserve inner products under certain probabilistically important mappings. In Čencov's theorem, the underlying differentiable manifold is the probability simplex $\sum_i x_i = 1$, $x_i > 0$. For some purposes of using geometry to obtain insights about probability, it is more convenient to regard the simplex as a hypersurface in the positive cone. In the present paper Čencov's result is extended to the positive cone. The proof uses standard techniques of differential geometry but does not use the language of category theory.
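For orientation, a display not contained in the original abstract: in the coordinates $x_i$, the information metric referred to above is commonly written, for tangent vectors $u, v$ at a point $p$ of the simplex (notation $u$, $v$, $p$, $n$ introduced here for illustration), as

\[
\langle u, v \rangle_p \;=\; \sum_{i=1}^{n} \frac{u_i\, v_i}{p_i},
\qquad \sum_{i} p_i = 1,\; p_i > 0,\; \sum_{i} u_i = \sum_{i} v_i = 0 .
\]

The same expression, with $p$ replaced by an arbitrary point $x$ of the positive cone and the constraint $\sum_i u_i = \sum_i v_i = 0$ dropped, defines a metric on the cone whose restriction to the simplex is the Fisher metric; this is the sense in which an extension of the characterization from the simplex to the cone is natural.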