Stochastic Global Optimization
Stochastic global optimization methods are methods for solving a global optimization problem that incorporate probabilistic (stochastic) elements, either in the problem data (the objective function, the constraints, etc.), in the algorithm itself, or in both. Global optimization is an important part of applied mathematics and computer science; its importance stems primarily from applied areas such as engineering, computational chemistry, finance and medicine, among many other fields. For the state of the art in the theory and methodology of global optimization we refer to the 'Journal of Global Optimization' and the two volumes of the 'Handbook of Global Optimization' [1,2]. If the objective function is given as a 'black box' computer code, the optimization problem is especially difficult; stochastic approaches can often deal with problems of this kind far more easily and efficiently than deterministic algorithms.

The problem of global minimization. Consider a general minimization problem f(x) → min, x ∈ X, with objective function f(·) and feasible region X. Let x* be a global minimizer of f(·); that is, x* is a point in X such that f(x*) = f*, where f* = min_{x∈X} f(x). Global optimization problems are usually formulated so that the structure of the feasible region X is relatively simple; this can be done at the expense of increased complexity of the objective function. A global minimization algorithm is a rule for constructing a sequence of points x_1, x_2, ... in X such that the sequence of record values y_on = min_{i=1,...,n} f(x_i) approaches the minimum f* as n increases. In addition to approximating the minimal value f*, one often needs to approximate at least one of the minimizers x*.

Heuristics. Many stochastic optimization algorithms involving randomness have been proposed heuristically. Some of these algorithms are based on analogies with natural processes; well-known examples are evolutionary algorithms [3] and simulated annealing [4] (a minimal simulated-annealing sketch is given at the end of this article). Heuristic global optimization algorithms are very popular in applications, especially for discrete optimization problems. Unfortunately, there is a large gap between the practical efficiency of stochastic global optimization algorithms and their theoretical rigor.

Stochastic assumptions about the objective function. In deterministic global optimization, Lipschitz-type conditions on the objective function are heavily exploited. Much research has been done in stochastic global optimization where stochastic assumptions about the objective function are used in a manner similar to how the Lipschitz condition is used in deterministic algorithms. A typical example of such an assumption is the postulate that f(·) is a realization of a certain stochastic process. This part of stochastic optimization is well described in [5], Chapter 4, and will not be pursued in this article.

Global random search (GRS). The main research in stochastic global optimization deals with so-called 'global random search' (GRS) algorithms, which involve random decisions in the process of choosing the observation points. A general GRS algorithm assumes that a sequence of random points x_1, x_2, ..., x_n is generated, where for each j ≥ 1 the point x_j has some probability distribution P_j. For each j ≥ 2, the distribution P_j may depend on the previous points x_1, ..., x_{j-1} and on the results of the objective function evaluations at these points (the function evaluations may be corrupted by noise).
The number of points n, 1 ≤ n ≤ ∞ (the stopping rule), can be either deterministic or random and may depend on the results of the function evaluations at the points x_1, ..., x_n.
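As a concrete illustration of the general GRS scheme, the following minimal Python sketch implements pure random search, the simplest GRS algorithm, in which every P_j is one and the same fixed distribution and the record values y_on are tracked. The function names (`pure_random_search`, `sample`) and the fixed-budget stopping rule are illustrative choices, not part of any standard library.

```python
import math
import random

def pure_random_search(f, sample, n, seed=None):
    """Pure random search: x_1, ..., x_n are drawn independently from one
    fixed distribution P (encoded by `sample`), and the record value
    y_on = min_{i<=n} f(x_i) is tracked as the approximation to f*."""
    rng = random.Random(seed)
    best_x, best_y = None, math.inf
    for _ in range(n):
        x = sample(rng)          # x_j ~ P, independent of the past
        y = f(x)                 # evaluate the black-box objective
        if y < best_y:           # update the record value y_on
            best_x, best_y = x, y
    return best_x, best_y

# Usage: a multimodal test objective on the box [-5, 5]^2.
rastrigin = lambda x: sum(t * t - 10 * math.cos(2 * math.pi * t) + 10 for t in x)
uniform_box = lambda rng: [rng.uniform(-5.0, 5.0) for _ in range(2)]
x_rec, y_rec = pure_random_search(rastrigin, uniform_box, n=100_000, seed=1)
```

Here n is a deterministic stopping rule; a random rule (for example, stopping once the record value has not improved for a prescribed number of evaluations) fits the same template.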
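Simulated annealing, mentioned in the 'Heuristics' paragraph, can also be viewed as a GRS algorithm of Markovian type: each P_j depends on the past only through the current point. Below is a minimal sketch assuming a Gaussian random-walk proposal, the Metropolis acceptance rule and geometric cooling; the parameter names and the cooling schedule are illustrative assumptions rather than a canonical implementation.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, n=100_000, t0=1.0, cooling=0.9999,
                        seed=None):
    """Markovian GRS: the distribution P_j of the next point depends on the
    past only through the current point. A Gaussian random-walk candidate
    is accepted with the Metropolis probability exp(-(f(cand) - f(x)) / T),
    and the temperature T is lowered geometrically."""
    rng = random.Random(seed)
    x, y = list(x0), f(x0)
    best_x, best_y = x, y
    t = t0
    for _ in range(n):
        cand = [xi + rng.gauss(0.0, step) for xi in x]  # random-walk proposal
        y_cand = f(cand)
        # Downhill moves are always accepted; uphill moves with prob e^{-d/T}.
        if y_cand < y or rng.random() < math.exp(-(y_cand - y) / t):
            x, y = cand, y_cand
            if y < best_y:
                best_x, best_y = x, y                   # record value y_on
        t *= cooling                                    # geometric cooling
    return best_x, best_y
```

With step, t0 and cooling as the main tuning knobs, this sketch trades exploration against exploitation; note that rigorous convergence guarantees for simulated annealing require much slower (logarithmic) cooling schedules than the geometric one used here.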
[4] P. J. M. van Laarhoven and E. H. L. Aarts, Simulated Annealing: Theory and Applications, Mathematics and Its Applications, 1987.
[5] A. A. Zhigljavsky, Theory of Global Random Search, 1991.