Values and Bounds for the Common Information of Two Discrete Random Variables

Recently, Wyner has defined a new measure of the dependence between two discrete random variables, the common information, which has important operational significance. The principal result of this paper is a lower bound on this quantity, valid for all joint distributions of two n-ary variables. For each n, the bound holds with equality for an n-parameter family of joint distributions. The bound depends on the maximum, over all row permutations of the joint distribution matrix, of its trace. A bound for joint distributions of zero trace is also obtained, and the cases of equality are characterized. The common information of L-shaped distributions is determined.
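
For reference, the quantity in question is Wyner's common information of a pair $(X, Y)$,
\[
C(X;Y) \;=\; \inf_{W \,:\, X \text{--} W \text{--} Y} I(X,Y;W),
\]
where the infimum is taken over all auxiliary random variables $W$ such that $X$ and $Y$ are conditionally independent given $W$ (i.e., $X$--$W$--$Y$ form a Markov chain).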
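
The permutation-maximized trace entering the lower bound can be written explicitly. Writing $P$ for the $n \times n$ joint distribution matrix, $P_{ij} = \Pr\{X = x_i,\, Y = y_j\}$, and $S_n$ for the symmetric group (the symbol $\tau$ is introduced here only for illustration), the quantity is
\[
\tau(P) \;=\; \max_{\sigma \in S_n} \sum_{i=1}^{n} P_{\sigma(i)\,i},
\]
the largest trace obtainable by permuting the rows of $P$; computing it is an instance of the assignment problem.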