Approximation algorithms for the normalizing constant of Gibbs distributions

Consider a family of distributions $\{\pi_{\beta}\}$ where $X\sim\pi_{\beta}$ means that $\mathbb{P}(X=x)=\exp(-\beta H(x))/Z(\beta)$. Here $Z(\beta)$ is the normalizing constant, equal to $\sum_x\exp(-\beta H(x))$. Then $\{\pi_{\beta}\}$ is known as a Gibbs distribution, and $Z(\beta)$ is the partition function. This work presents a new method for approximating the partition function to a specified level of relative accuracy using a number of samples that is only $O(\ln(Z(\beta))\ln(\ln(Z(\beta))))$ when $Z(0)\geq1$. This is a sharp improvement over previous, similar approaches, which relied on a much more complicated algorithm requiring $O(\ln(Z(\beta))\ln(\ln(Z(\beta)))^5)$ samples.
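To fix ideas, the quantities in the definition can be computed directly when the state space is small enough to enumerate. The sketch below (not part of the paper; the toy Hamiltonian $H$ counting the 1-bits of a binary string is an illustrative assumption) evaluates $Z(\beta)=\sum_x\exp(-\beta H(x))$ by brute force, which is exactly the quantity the sampling-based algorithm approximates when enumeration is infeasible.

```python
import math
import itertools

def partition_function(H, states, beta):
    """Brute-force Z(beta) = sum_x exp(-beta * H(x)) over a finite state space."""
    return sum(math.exp(-beta * H(x)) for x in states)

# Toy Hamiltonian (an assumption for illustration): H(x) counts the 1-bits
# of a binary string x of length n.
n = 4
states = list(itertools.product([0, 1], repeat=n))
H = lambda x: sum(x)

# At beta = 0 every state has weight exp(0) = 1, so Z(0) = 2**n >= 1,
# matching the paper's assumption Z(0) >= 1.
print(partition_function(H, states, 0.0))  # 16.0

# For this separable H, Z(beta) factorizes as (1 + exp(-beta))**n,
# which gives an exact check of the brute-force sum.
beta = 1.5
z = partition_function(H, states, beta)
print(abs(z - (1 + math.exp(-beta)) ** n) < 1e-12)  # True
```

Here $Z(\beta)$ is decreasing in $\beta$ for nonnegative $H$, which is why the sample complexity is naturally stated in terms of $\ln(Z(\beta))$ at the endpoints of the temperature schedule.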