Interval Analysis: Unconstrained and Constrained Optimization

Interval optimization methods (see Interval Analysis: Unconstrained and Constrained Optimization) are guaranteed not to lose global optimizer points. To achieve this, a deterministic branch-and-bound framework is applied. Still, heuristic algorithmic improvements may increase the convergence speed while keeping the guaranteed reliability.

The indicator parameter called RejectIndex,

$$ pf^*(X) = \frac{f^* - \underline{F}(X)}{\overline{F}(X) - \underline{F}(X)}, $$

was suggested by L. G. Casado as a measure of the closeness of the interval X to a global minimizer point [1]. Here $F(X) = [\underline{F}(X), \overline{F}(X)]$ denotes the inclusion function of the objective function f on the box X, and $f^*$ is the global minimum value. The indicator was first applied to improve the work load balance of parallel interval global optimization algorithms. A subinterval X of the search space with the minimal lower bound $\underline{F}(X)$ of the inclusion function is usually considered the best candidate to contain a global minimum. However, the larger the interval X, the larger the overestimation of the range f(X) by F(X). Therefore a box could be considered a good candidate to contain a global minimum merely because it is larger than the others. In order to compare subintervals of different sizes, we normalize the distance between the global minimum value $f^*$ and $\underline{F}(X)$ by the width of F(X).

The idea behind $pf^*$ is that in general we expect the overestimation to be symmetric, i.e., for small subintervals containing a global minimizer point the overestimation above the maximum of f(X) is close to the overestimation below its minimum. Hence, for such intervals X the relative position of the global optimum value inside the interval F(X) should be high, while for intervals far from global minimizer points $pf^*$ must be small. Obviously, there are exceptions, and there is no theoretical proof that $pf^*$ is a reliable indicator of nearby global minimizer points.

Moreover, the value of the global minimum is not available in most cases. A generalized expression for a wider class of indicators is

$$ p(\hat{f}, X) = \frac{\hat{f} - \underline{F}(X)}{\overline{F}(X) - \underline{F}(X)}, $$

where $\hat{f}$ is some approximation of the global minimum. We assume that $\hat{f} \in F(X)$, i.e., the estimate is realistic in the sense that $\hat{f}$ lies within the known bounds of the objective function on the search region. According to the numerical experience collected, a good approximation of $f^*$ is needed to improve the efficiency of the algorithm.

Subinterval selection. I. Among the possible applications of these indicators the …
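As an illustration only, and not the authors' code, the following minimal Python sketch shows how the generalized indicator $p(\hat{f}, X)$ could drive box selection in an interval branch-and-bound loop. All names (Interval, F, p, select_box) and the test function f(x) = x^2 - 2x + 1 are hypothetical, and the interval arithmetic is a naive stand-in without outward rounding, so it is not rigorous.

```python
# Sketch: generalized subinterval selection indicator p(f_hat, X)
#   p(f_hat, X) = (f_hat - lb F(X)) / (ub F(X) - lb F(X))
# used to pick the next box to bisect in a toy branch-and-bound loop.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def sqr(self) -> "Interval":
        # Square of an interval; handles intervals containing 0 correctly.
        a, b = self.lo * self.lo, self.hi * self.hi
        lo = 0.0 if self.lo <= 0.0 <= self.hi else min(a, b)
        return Interval(lo, max(a, b))

    @property
    def width(self) -> float:
        return self.hi - self.lo

    def bisect(self) -> Tuple["Interval", "Interval"]:
        mid = 0.5 * (self.lo + self.hi)
        return Interval(self.lo, mid), Interval(mid, self.hi)


def F(box: Interval) -> Interval:
    """Natural interval extension of f(x) = x^2 - 2x + 1 (minimum 0 at x = 1)."""
    return box.sqr() + Interval(-2.0 * box.hi, -2.0 * box.lo) + Interval(1.0, 1.0)


def p(f_hat: float, box: Interval) -> float:
    """Generalized indicator; larger values suggest (heuristically) that the
    box is closer to a global minimizer."""
    fx = F(box)
    return 1.0 if fx.width == 0.0 else (f_hat - fx.lo) / fx.width


def midpoint_value(box: Interval) -> float:
    """Upper bound of F at the midpoint: a valid upper bound on f there,
    playing the role of the f* approximation f_hat."""
    m = 0.5 * (box.lo + box.hi)
    return F(Interval(m, m)).hi


def select_box(boxes: List[Interval], f_hat: float) -> int:
    """Index of the box with the largest indicator value."""
    return max(range(len(boxes)), key=lambda i: p(f_hat, boxes[i]))


if __name__ == "__main__":
    boxes = [Interval(-3.0, 0.0), Interval(0.0, 3.0)]
    f_hat = min(midpoint_value(b) for b in boxes)
    for _ in range(6):
        i = select_box(boxes, f_hat)
        left, right = boxes.pop(i).bisect()
        boxes += [left, right]
        f_hat = min(f_hat, midpoint_value(left), midpoint_value(right))
    print("f_hat =", f_hat)
    print("boxes =", boxes)
```

In this sketch the ordering by p simply replaces the usual "smallest lower bound first" rule; a real implementation would use a verified interval-arithmetic library with outward rounding and combine the indicator with the standard cut-off and monotonicity tests.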

[1] E. Hansen et al.: Global Optimization Using Interval Analysis. Pure and Applied Mathematics, 1992.

[2] T. Csendes et al.: Multisection in Interval Branch-and-Bound Methods for Global Optimization – I. Theoretical Results. J. Glob. Optim., 2000.

[3] T. Csendes et al.: Numerical Experiences with a New Generalized Subinterval Selection Criterion for Interval Global Optimization. Reliab. Comput., 2003.

[4] T. Csendes et al.: Multisection in Interval Branch-and-Bound Methods for Global Optimization – II. Numerical Tests. J. Glob. Optim., 2000.

[5] T. Csendes et al.: New Subinterval Selection Criteria for Interval Global Optimization. J. Glob. Optim., 2001.

[6] R. B. Kearfott: Rigorous Global Search: Continuous Problems. 1996.

[7] V. Kreinovich et al.: Theoretical Justification of a Heuristic Subbox Selection Criterion. 2001.

[8] J. D. Pinter: Global Optimization in Action – Continuous and Lipschitz Optimization: Algorithms, Implementations and Applications. 2010.

[9] T. Csendes et al.: A New Multisection Technique in Interval Methods for Global Optimization. Computing, 2000.

[10] I. García et al.: New Load Balancing Criterion for Parallel Interval Global Optimization Algorithms. 1998.

[11] T. Csendes et al.: Generalized Subinterval Selection Criteria for Interval Global Optimization. Numerical Algorithms, 2004.

[12] L. G. Casado et al.: Heuristic Rejection in Interval Global Optimization. 2003.

[13] J. G. Rokne et al.: New Computer Methods for Global Optimization. 1988.

[14] M. Cs. Markót et al.: New Interval Methods for Constrained Global Optimization. Math. Program., 2006.