A global search method for optimizing nonlinear systems

The theory and implementation of a global search method of optimization in n dimensions, inspired by Kushner's method in one dimension, are presented. The method is meant for optimization problems in which the function has many extrema, in which it may or may not be differentiable, and in which it is important to minimize the number of function evaluations, even at the cost of increased computation per sample. Comparisons are made to the performance of other global optimization techniques on a set of standard differentiable test functions. A new class of discrete-valued test functions is introduced, and the performance of the method is determined on a randomly generated set of these functions. Overall, the method has the power of other Bayesian/sampling techniques without requiring a separate local optimization technique to improve convergence. This makes it possible for the search to operate on unknown functions that may contain one or more discrete components.
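For readers unfamiliar with the one-dimensional method of Kushner that motivates this work, the sketch below illustrates its core idea: model the unknown function as Brownian motion conditioned on the samples taken so far, and pick the next evaluation point to maximize the posterior probability of improving on the best value found by some margin. This is a minimal one-dimensional illustration only, not the paper's n-dimensional method; the function name kushner_next_point and the parameters eps (improvement margin), sigma2 (process variance), and grid (per-interval search resolution) are illustrative assumptions, not quantities from the paper.

```python
# Minimal 1-D sketch of a Kushner-style global search (illustration only;
# the paper presents an n-dimensional generalization of this idea).
# Between adjacent samples the unknown function is treated as a Brownian
# bridge; the next sample maximizes the posterior probability of beating
# the current best value by at least eps.

import math

def normal_cdf(z):
    # Standard normal CDF via the error function (avoids external deps).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def kushner_next_point(xs, fs, eps=0.1, sigma2=1.0, grid=200):
    """Choose the next sample location given sorted samples xs, values fs."""
    f_best = min(fs)
    best_x, best_p = None, -1.0
    for (x0, f0), (x1, f1) in zip(zip(xs, fs), zip(xs[1:], fs[1:])):
        for k in range(1, grid):
            x = x0 + (x1 - x0) * k / grid
            # Conditional mean: linear interpolation between bracketing samples.
            mu = f0 + (f1 - f0) * (x - x0) / (x1 - x0)
            # Conditional variance of a Brownian bridge on [x0, x1].
            var = sigma2 * (x - x0) * (x1 - x) / (x1 - x0)
            # Probability of improving on f_best by at least eps.
            p = normal_cdf((f_best - eps - mu) / math.sqrt(var))
            if p > best_p:
                best_x, best_p = x, p
    return best_x

# Usage: a few iterations on a multiextremal test function.
f = lambda x: math.sin(3 * x) + 0.5 * x
xs = [0.0, 4.0]                 # endpoints of the search interval
fs = [f(x) for x in xs]
for _ in range(10):
    x_new = kushner_next_point(xs, fs)
    i = next(i for i, xi in enumerate(xs) if xi > x_new)
    xs.insert(i, x_new)         # keep samples sorted by location
    fs.insert(i, f(x_new))
print(min(fs))
```

Note that the criterion trades extra computation (searching the probability surface over every interval) for fewer evaluations of the objective itself, which is the trade-off the abstract describes; because it never needs gradients, the same selection rule applies unchanged when the objective has discrete components.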