Maximally informative statistics

In this paper we propose a Bayesian, information-theoretic approach to dimensionality reduction. The approach is formulated as a variational principle on mutual information, and it gives a unified treatment of sufficiency, relevance, and representation. Maximally informative statistics are shown to minimize a Kullback-Leibler distance between posterior distributions. Illustrating the approach, we derive the maximally informative one-dimensional statistic for a random sample from the Cauchy distribution.
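As a sketch of the variational principle stated above (the notation here is ours, not fixed by the abstract): write $\theta$ for the parameter, $x$ for the data, and $T$ for a candidate statistic. The maximally informative statistic solves
\[
  T^{\star} = \arg\max_{T} \; I\bigl(\Theta; T(X)\bigr),
\]
and the chain rule for mutual information gives the identity
\[
  I(\Theta; X) - I\bigl(\Theta; T(X)\bigr)
  = \mathbb{E}_{x}\!\left[ D_{\mathrm{KL}}\bigl( p(\theta \mid x) \,\big\|\, p(\theta \mid T(x)) \bigr) \right],
\]
so maximizing the information carried by $T$ is equivalent to minimizing the expected Kullback-Leibler distance between the posterior given the full data and the posterior given the statistic; the gap vanishes exactly when $T$ is sufficient.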