Sub-Gaussian Mean Estimation in Polynomial Time

We study polynomial-time algorithms for estimating the mean of a random vector $X$ in $\mathbb{R}^d$ from $n$ independent samples $X_1,\ldots,X_n$ when $X$ may be heavy-tailed. We assume only that $X$ has finite mean $\mu$ and covariance $\Sigma$. In this setting, the radius of the confidence intervals achieved by the empirical mean is large compared to the case in which $X$ is Gaussian or sub-Gaussian. In particular, for confidence parameter $\delta > 0$, the empirical mean has confidence intervals with radius of order $\sqrt{\text{Tr}\,\Sigma / (\delta n)}$, rather than the radius $\sqrt{\text{Tr}\,\Sigma / n} + \sqrt{\lambda_{\max}(\Sigma) \log(1/\delta) / n}$ achievable in the Gaussian case, where $\lambda_{\max}(\Sigma)$ is the largest eigenvalue of $\Sigma$. We offer the first polynomial-time algorithm to estimate the mean with sub-Gaussian confidence intervals under such mild assumptions. Our algorithm is based on a new semidefinite programming relaxation of a high-dimensional median. Previous estimators that assume only the existence of $O(1)$ moments of $X$ either sacrifice sub-Gaussian performance or are only known to be computable via brute-force search procedures requiring $\exp(d)$ time.
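As an illustration of the gap between the two radii quoted above, the following minimal Python sketch (not from the paper; the dimension, sample size, failure probability $\delta$, and isotropic covariance are hypothetical choices made only for this example) evaluates both bounds numerically.

```python
# Minimal numerical sketch comparing the two confidence-interval radii from the abstract.
# All parameters below are illustrative assumptions, not values from the paper.
import numpy as np


def empirical_mean_radius(trace_sigma: float, n: int, delta: float) -> float:
    """Heavy-tailed (Chebyshev-style) radius of the empirical mean: sqrt(Tr(Sigma) / (delta * n))."""
    return np.sqrt(trace_sigma / (delta * n))


def sub_gaussian_radius(trace_sigma: float, lambda_max: float, n: int, delta: float) -> float:
    """Sub-Gaussian radius: sqrt(Tr(Sigma) / n) + sqrt(lambda_max(Sigma) * log(1/delta) / n)."""
    return np.sqrt(trace_sigma / n) + np.sqrt(lambda_max * np.log(1.0 / delta) / n)


if __name__ == "__main__":
    d, n, delta = 1000, 10_000, 1e-6      # hypothetical dimension, sample size, failure probability
    eigenvalues = np.ones(d)              # isotropic covariance: Tr(Sigma) = d, lambda_max(Sigma) = 1
    tr, lam = eigenvalues.sum(), eigenvalues.max()
    print(f"empirical-mean radius: {empirical_mean_radius(tr, n, delta):.3f}")
    print(f"sub-Gaussian radius:   {sub_gaussian_radius(tr, lam, n, delta):.3f}")
```

With these illustrative parameters the empirical-mean radius is roughly $316$ while the sub-Gaussian radius is roughly $0.35$, reflecting the $\sqrt{1/\delta}$ versus $\sqrt{\log(1/\delta)}$ dependence on the confidence parameter.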