The emerging field of adaptive data analysis seeks algorithms with provable guarantees for machine learning workflows in which researchers reuse their data, a practice that falls outside the usual statistical paradigm of static data analysis. In 2014, Dwork, Feldman, Hardt, Pitassi, Reingold, and Roth introduced one such model and proposed several solutions based on differential privacy. In previous work in 2016, we identified a problem with this model and proposed a Bayesian variant instead, but also found that the analogous Bayesian methods cannot achieve the statistical guarantees of the static case.
In this paper, we prove the first positive results for the Bayesian model, showing that with a Dirichlet prior, the posterior mean algorithm indeed matches the statistical guarantees of the static case. The main ingredient is a new theorem showing that the $\mathrm{Beta}(\alpha,\beta)$ distribution is subgaussian with variance proxy $O(1/(\alpha+\beta+1))$, a concentration result also of independent interest. We give two proofs of this result: a probabilistic proof using a simple condition on the raw moments of a positive random variable, and a learning-theoretic proof that treats the beta distribution as a posterior; both have implications for other related problems.
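The subgaussian claim can be sanity-checked numerically: a random variable $X$ is subgaussian with variance proxy $\sigma^2$ if $\mathbb{E}[e^{\lambda(X-\mathbb{E}X)}] \le e^{\lambda^2\sigma^2/2}$ for all $\lambda$. The sketch below checks this moment-generating-function bound for Beta samples, using the hypothetical constant $c = 1$ inside $\sigma^2 = c/(\alpha+\beta+1)$; the abstract only asserts $O(1/(\alpha+\beta+1))$, so the exact constant is an assumption for illustration.

```python
import numpy as np

def mgf_bound_holds(alpha, beta, lam, c=1.0, n=200_000, seed=0):
    """Empirically check the subgaussian MGF bound
        E[exp(lam * (X - E[X]))] <= exp(lam**2 * sigma2 / 2)
    for X ~ Beta(alpha, beta), with the candidate variance proxy
    sigma2 = c / (alpha + beta + 1).  The constant c is a placeholder:
    the result only claims O(1/(alpha + beta + 1))."""
    rng = np.random.default_rng(seed)
    x = rng.beta(alpha, beta, size=n)
    mu = alpha / (alpha + beta)          # mean of Beta(alpha, beta)
    emp_mgf = np.exp(lam * (x - mu)).mean()
    sigma2 = c / (alpha + beta + 1)
    return emp_mgf <= np.exp(lam**2 * sigma2 / 2)
```

For instance, `mgf_bound_holds(2, 3, 2.0)` tests one point of the bound; a real verification would sweep a grid of $\lambda$ values and parameter pairs.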
[1] V. V. Buldygin, et al. The sub-Gaussian norm of a binary random variable, 2013.
[2] V. Buldygin, et al. Metric characterization of random variables and random processes, 2000.
[3] Richard J. Cleary. Handbook of Beta Distribution and Its Applications, 2006.
[4] Sam Elder, et al. Challenges in Bayesian Adaptive Data Analysis, 2016, arXiv.
[5] Afonso S. Bandeira, et al. Statistical limits of spiked tensor models, 2016, Annales de l'Institut Henri Poincaré, Probabilités et Statistiques.
[6] Toniann Pitassi, et al. Preserving Statistical Validity in Adaptive Data Analysis, 2014, STOC.
[7] R. Handel. Probability in High Dimension, 2014.
[8] Avrim Blum, et al. The Ladder: A Reliable Leaderboard for Machine Learning Competitions, 2015, ICML.
[9] D. Freedman. A Note on Screening Regression Equations, 1983.