On the convergence of global methods in multiextremal optimization

A general class of derivative-free optimization procedures is presented, together with the corresponding convergence theory. This theory turns out to be very constructive, in the sense that the convergence conditions not only can be verified easily for many existing algorithms, but also allow one to construct new procedures. It is shown that well-known approaches such as branch-and-bound concepts, Pintér's general class of procedures, the algorithms of Pijavskii, Shubert, and Mladineo, and the approach of Zheng and Galperin can not only be subsumed under this class of methods, but can also be partly improved by viewing them within the framework presented.
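To give a concrete sense of one of the subsumed methods, the following is a minimal sketch of the Pijavskii–Shubert saw-tooth scheme for minimizing a Lipschitz-continuous function on an interval. It is only an illustration of that classical algorithm, not of the general framework of the paper; the objective, the interval, the Lipschitz overestimate L, and the iteration budget are assumptions chosen for the example.

```python
# Illustrative sketch of the Pijavskii-Shubert algorithm (not the paper's
# general framework). All concrete values below are assumptions.

def shubert_pijavskii(f, a, b, L, n_iter=50):
    """Approximately minimize f on [a, b], assuming |f(x)-f(y)| <= L|x-y|."""
    xs = [a, b]
    fs = [f(a), f(b)]
    for _ in range(n_iter):
        # The lower bounding function max_i ( f(x_i) - L |x - x_i| ) is a
        # saw-tooth; between two neighbouring sample points its minimum lies
        # at the intersection of the two cones anchored at those points.
        best_bound, best_x = float("inf"), None
        order = sorted(range(len(xs)), key=xs.__getitem__)
        for i, j in zip(order, order[1:]):
            xi, xj, fi, fj = xs[i], xs[j], fs[i], fs[j]
            x_new = 0.5 * (xi + xj) + (fi - fj) / (2.0 * L)
            bound = 0.5 * (fi + fj) - 0.5 * L * (xj - xi)
            if bound < best_bound:
                best_bound, best_x = bound, x_new
        # Evaluate f where the current lower bound is smallest.
        xs.append(best_x)
        fs.append(f(best_x))
    k = min(range(len(fs)), key=fs.__getitem__)
    return xs[k], fs[k]


if __name__ == "__main__":
    import math
    # Multiextremal test function on [0, 10]; L = 6 is a rough overestimate
    # of its Lipschitz constant (|cos x + 3 cos 3x| <= 4).
    x_star, f_star = shubert_pijavskii(
        lambda x: math.sin(x) + math.sin(3.0 * x), 0.0, 10.0, L=6.0
    )
    print(x_star, f_star)
```

At each step the method refines a piecewise-linear underestimate of the objective and samples where that underestimate is lowest, which is the basic bounding idea that the convergence conditions of the paper are designed to capture in much greater generality.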