Convergence rates of a global optimization algorithm

This paper presents a best- and worst-case analysis of convergence rates for a deterministic global optimization algorithm. Superlinear convergence is proved for Lipschitz functions that are convex in the direction of the global maximum (concave in the direction of the global minimum). Computer results confirming the theoretical convergence rates are given.
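For reference only, and not as a definition taken from the paper itself, superlinear convergence of the iterates \(x_n\) to the global maximizer \(x^*\) (symbols introduced here for illustration) is conventionally understood as

\[
  \lim_{n \to \infty} \frac{\lVert x_{n+1} - x^* \rVert}{\lVert x_n - x^* \rVert} = 0,
\]

that is, the error is reduced by an asymptotically unbounded factor at each iteration, faster than any fixed linear rate.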