Converse Jensen Inequality

We use Skorokhod’s embedding theorem to give a new proof of a converse to Jensen’s inequality. Let (Ω, F, P) be a probability space and let X be an element of L¹(Ω, F, P). Let G ⊂ F be a σ-algebra, and define Y := E[X | G], an element of L¹(Ω, G, P). If φ : R → R is convex, then the conditional form of Jensen’s inequality asserts that

    E[φ(X) | G] ≥ φ(E[X | G]) = φ(Y)   a.s.,

part of the assertion being that the conditional expectations are almost surely well-defined. In particular,

    (1)   E[φ(X)] ≥ E[φ(Y)],

with an analogous stipulation. The following converse assertion seems to be well known; see [2] and [3]. The proof we present may have some claim to novelty. We write X =d Y to indicate that the random variables X and Y have the same distribution.

Theorem. Let X and Y be integrable random variables such that (1) holds for every convex φ. Then there exist a probability space (Ω′, F′, P′), a random variable X′ ∈ L¹(Ω′, F′, P′), and a σ-algebra G′ ⊂ F′ such that X =d X′ and Y =d E[X′ | G′].

Proof. Taking φ(x) = x and then φ(x) = −x, we see that E[X] = E[Y]. Let B = (B_t) be a one-dimensional Brownian motion defined on a filtered probability space (Ω′, F′, (F′_t), P′) such that B_0 =d Y. By Skorokhod’s embedding theorem, in the form presented in [1], condition (1) implies that there is an (F′_t)-stopping time T such that B_T =d X; moreover, the embedding of [1] can be chosen so that the stopped martingale (B_{t∧T}) is uniformly integrable, and then optional stopping gives E[B_T | F′_0] = B_0. Since B_0 =d Y, we may take X′ = B_T and G′ = F′_0.

References

[1] R. V. Chacon and J. B. Walsh: One-dimensional potential embedding. Séminaire de Probabilités X, pp. 19–23, Lecture Notes in Mathematics 511, Springer, Berlin, 1976.
[2] P.-A. Meyer: Probability and Potentials. Blaisdell, Waltham-Toronto-London, 1966.
[3] A. Paszkiewicz: On distributions of conditional expectations. Probab. Math. Statist. 20 (2000), 287–291.
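The construction in the proof can be illustrated numerically in the simplest nontrivial case. The sketch below is not part of the argument: it assumes X uniform on {−1, +1} and G trivial, so Y = E[X | G] = 0, starts the (discretized) Brownian path at B_0 = Y = 0, and stops it at T = inf{t : |B_t| = 1}; the names n_paths, dt, and barrier are illustrative choices of ours.

```python
import numpy as np

# Illustrative case only (not from the note): X uniform on {-1, +1}, G trivial,
# so Y = E[X | G] = 0 and (1) reads (phi(-1) + phi(1))/2 >= phi(0) for convex phi.
# The proof's construction starts Brownian motion at B_0 = Y = 0 and stops it at
# T = inf{t : |B_t| = 1}; then B_T should have the law of X and E[B_T | F'_0] = B_0.

rng = np.random.default_rng(0)

n_paths = 50_000   # number of simulated paths (illustrative choice)
dt = 1e-3          # Euler step size for the Brownian increments
barrier = 1.0      # stop when the path leaves (-1, 1)

b = np.zeros(n_paths)                 # every path starts at B_0 = Y = 0
alive = np.ones(n_paths, dtype=bool)  # paths that have not yet hit {-1, +1}

while alive.any():
    b[alive] += np.sqrt(dt) * rng.standard_normal(alive.sum())
    hit = alive & (np.abs(b) >= barrier)
    b[hit] = np.sign(b[hit]) * barrier   # clip the discretization overshoot
    alive &= ~hit

# B_T should be approximately uniform on {-1, +1}, i.e. distributed as X,
# while its mean recovers B_0 = Y = 0, matching E[B_T | F'_0] = B_0.
print("P(B_T = +1) ~", (b > 0).mean())
print("E[B_T]      ~", b.mean())
```

As dt → 0 and n_paths → ∞, the two printed quantities converge to 1/2 and 0, the values the proof predicts for this example.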