On strong homogeneity of a class of global optimization algorithms working with infinite and infinitesimal scales

Abstract The necessity to find the global optimum of multiextremal functions arises in many applied problems where finding local solutions is insufficient. One of the desirable properties of global optimization methods is strong homogeneity, meaning that a method produces the same sequence of trial points (points where the objective function is evaluated) regardless of both multiplication of the objective function by a scaling constant and addition of a shifting constant. In this paper, several aspects of global optimization with strongly homogeneous methods are considered. First, it is shown that, even if a method possesses this property theoretically, very small or very large scaling constants can in practice lead to ill-conditioning of the scaled problem. Second, a new class of global optimization problems is introduced in which the objective function can have not only a finite but also an infinite or infinitesimal Lipschitz constant. Third, the strong homogeneity of several Lipschitz global optimization algorithms is studied within the Infinity Computing paradigm, which allows one to work numerically with a variety of infinities and infinitesimals. Fourth, it is proved that a class of efficient univariate methods enjoys this property for finite, infinite, and infinitesimal scaling and shifting constants. Finally, it is shown that in certain cases the use of numerical infinities and infinitesimals can avoid the ill-conditioning produced by scaling. Numerical experiments illustrating the theoretical results are described.
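As a minimal illustration of strong homogeneity (not the paper's own experiments), the sketch below runs a basic Piyavskii-type univariate scheme, which evaluates the objective at the minimizer of a saw-tooth lower bound built from a Lipschitz constant L, on a function f and on its scaled-and-shifted copy g = λf + δ with the Lipschitz constant rescaled to λL. Since the apex of the lower bound on [x1, x2], namely (x1 + x2)/2 − (f(x2) − f(x1))/(2L), is invariant under f → λf + δ together with L → λL, both runs generate the same sequence of trial points up to rounding. The test function, interval, and constants are illustrative choices.

```python
import bisect
import math

def piyavskii_points(f, a, b, L, n_iter):
    """Basic Piyavskii-type univariate global minimization sketch.

    At each step, evaluate f at the point minimizing the piecewise
    saw-tooth lower bound induced by the Lipschitz constant L, and
    return the full sequence of trial points.
    """
    xs = [a, b]              # sorted trial points
    fs = [f(a), f(b)]        # objective values at xs
    history = [a, b]
    for _ in range(n_iter):
        best_lb, best_x = math.inf, None
        for i in range(len(xs) - 1):
            x1, x2 = xs[i], xs[i + 1]
            f1, f2 = fs[i], fs[i + 1]
            # apex (minimizer) of the saw-tooth bound on [x1, x2]
            xm = 0.5 * (x1 + x2) - (f2 - f1) / (2.0 * L)
            # value of the lower bound at the apex
            lb = 0.5 * (f1 + f2) - 0.5 * L * (x2 - x1)
            if lb < best_lb:
                best_lb, best_x = lb, xm
        # insert the new trial point, keeping xs sorted
        j = bisect.bisect_left(xs, best_x)
        xs.insert(j, best_x)
        fs.insert(j, f(best_x))
        history.append(best_x)
    return history

# Illustrative multiextremal test function on [2.7, 7.5];
# L = 4.3 overestimates max |f'| = max |cos x + (10/3) cos(10x/3)|.
f = lambda x: math.sin(x) + math.sin(10.0 * x / 3.0)
g = lambda x: 2.0 * f(x) + 5.0       # scaled and shifted copy

h_f = piyavskii_points(f, 2.7, 7.5, 4.3, 6)
h_g = piyavskii_points(g, 2.7, 7.5, 2.0 * 4.3, 6)
# h_f and h_g coincide: the method is strongly homogeneous here.
```

Note that with finite-precision arithmetic this invariance degrades for extreme scaling constants (e.g. for δ = 1 and λ of order 1e17 the shift is absorbed entirely by rounding), which is the ill-conditioning effect discussed in the paper.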
