ABSTRACT
In this paper a new stochastic algorithm for function optimization is presented. Called Generalized Extremal Optimization, it was inspired by the theory of Self-Organized Criticality and is intended for use in complex inverse design problems, where traditional gradient-based optimization methods may become inefficient. Preliminary results on a set of test functions show that this algorithm can be competitive with other stochastic methods such as genetic algorithms.

NOMENCLATURE
k   Index of bit rank.
L   Length of the binary string that encodes the design variables.
l   Length of the binary string for one design variable.
N   Number of design variables.
V   Value of the objective function for a given binary string.
x   Design variable.
∆V  Bit fitness.
τ   Free adjustable parameter of the optimization algorithm.

INTRODUCTION
Stochastic algorithms inspired by nature have been successfully used to tackle optimization problems in engineering and science. Simulated Annealing (SA) and Genetic Algorithms (GAs) are probably the two most widely used. Their robustness and the ease with which they can be applied to a broad class of problems, regardless of difficulties such as the presence of multiple local minima in the design space or the mixing of continuous and discrete variables, have made them good tools for tackling complex problems, for example in the aerospace field. The main disadvantage of these methods is that they usually need a great number of objective function evaluations to be effective. Hence, in problems where the calculation of the objective function is very time consuming, these methods may become impracticable. Nevertheless, the availability of fast computing resources and the use of hybrid techniques have made the power of these algorithms available even for that kind of problem. There are today many derivatives of SA and GAs, created to make the original algorithms more efficient while keeping essentially the same principles.
Recently, Boettcher and Percus have proposed a new optimization method based on a simplified model of biological evolution developed to show the emergence of Self-Organized Criticality (SOC) in ecosystems. Called Extremal Optimization (EO), it has been successfully applied to hard problems in combinatorial optimization. Although algorithms such as SA, GAs and EO are inspired by natural processes, their practical implementation shares a common feature: the search for the optimum is done through a stochastic process that is "guided" by the setting of adjustable parameters. Since the proper setting of these parameters is very important to the performance of the algorithms, it is highly desirable that they have few such parameters, so that finding the best setting for a given optimization problem does not become a costly task in itself. The EO algorithm has only one adjustable parameter. This may be an "a priori" advantage over the SA and GA algorithms, since they use more than one.

In this paper the Generalized Extremal Optimization (GEO) algorithm is presented. The GEO algorithm is built on the EO method, but the way it is implemented allows it to be readily applied to a broad class of engineering problems. The algorithm is easy to implement, does not make use of derivatives and can be applied to non-convex or disjoint problems. It can also, in principle, deal with any kind of variable, whether continuous, discrete or integer. All these features make it suitable for complex inverse design problems, where traditional gradient methods cannot be applied properly due to, for example, the presence of multiple local minima or the use of mixed types of design variables. In this work the performance of the GEO algorithm is tested on a set of non-linear multimodal functions commonly used to test GAs.
The performance of the GEO algorithm on these functions is compared with that of a standard GA and the Cooperative Coevolutionary GA (CCGA) proposed by Potter and De Jong.

Fabiano Luis de Sousa, INPE-DMC, Av. dos Astronautas, 1758, S.J.Campos, 12227-010, Brazil. Email: fabiano@dem.inpe.br
Fernando Manuel Ramos, INPE-LAC, Av. dos Astronautas, 1758, S.J.Campos, 12227-010, Brazil. Email: fernando@lac.inpe.br
4th International Conference on Inverse Problems in Engineering, Rio de Janeiro, Brazil, 2002

THE EXTREMAL OPTIMIZATION ALGORITHM
Self-organized criticality has been used to explain the behavior of complex systems in areas as different as geology, economics and biology. The theory of SOC states that large interactive systems evolve naturally to a critical state where a single change in one of their elements generates "avalanches" that can reach any number of elements in the system. The probability distribution of the sizes s of these avalanches is described by a power law of the form P(s) ~ s^(-γ), where γ is a positive parameter. That is, smaller avalanches are more likely to occur than big ones, but even avalanches as big as the whole system may occur with non-negligible probability. To show that SOC could explain features of systems such as natural evolution, Bak and Sneppen developed a simplified model of an ecosystem in which species are placed side by side on a line with periodic boundary conditions. To each species a fitness number is assigned randomly, with uniform distribution, in the range [0,1]. The least adapted species, the one with the lowest fitness, is then forced to mutate, and a new random number is assigned to it. The change in the fitness of the least adapted species alters the fitness landscape of its neighbors, and to cope with that, new random numbers are also assigned to them, even if they are well adapted. After some iterations, the system evolves to a critical state where all species have fitness above a critical threshold.
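The Bak-Sneppen dynamics described above can be sketched in a few lines. This is a minimal illustration, not the authors' code: the function name, population size, step count and seed are our own choices, and the self-organized critical threshold is only alluded to in a comment.

```python
import random

def bak_sneppen(n_species=100, n_steps=10000, seed=0):
    """Minimal Bak-Sneppen ecosystem: each step, mutate the least-fit
    species and its two neighbors (periodic boundary conditions)."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n_species)]
    for _ in range(n_steps):
        worst = min(range(n_species), key=fitness.__getitem__)
        # Mutating the worst species also disturbs the fitness
        # landscape of its neighbors, so they are re-drawn too.
        for i in (worst - 1, worst, worst + 1):
            fitness[i % n_species] = rng.random()
    return fitness

final = bak_sneppen()
# After many steps most fitness values sit above a critical threshold
# (around 0.67 in the large-system limit), well above the uniform mean.
```

Each step only touches three species, yet the correlated re-draws are what produce the avalanches of sub-threshold fitness values mentioned in the text.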
However, the dynamics of the system eventually causes a number of species to fall below the critical threshold in avalanches that can be as big as the whole system. An optimization heuristic based on a search dynamics that embodies SOC would evolve solutions quickly by systematically mutating the worst individuals. At the same time, this approach would preserve, throughout the search process, the possibility of probing different regions of the design space (via avalanches), enabling the algorithm to escape local optima. Inspired by the SOC theory, the basic EO algorithm was proposed as follows:

1. Initialize a configuration C of design variables xi at will; set Cbest = C.
2. For the current configuration C,
   a) assign a fitness Fi to each variable xi,
   b) find j satisfying Fj ≤ Fi for all i,
   c) choose C' in a neighborhood N(C) of C so that xj must change,
   d) accept C = C' unconditionally,
   e) if F(C) < F(Cbest) then set Cbest = C.
3. Repeat step (2) as long as desired.
4. Return Cbest and F(Cbest).

The above algorithm shows good performance on problems, such as graph partitioning, where new configurations can be chosen randomly among neighborhoods of C while satisfying step 2c. But when applied to other types of problems, it can lead to a deterministic search. To overcome this, the algorithm was modified as follows: in step 2b the N variables xi are ranked so that the variable with the lowest fitness is assigned rank 1, and the one with the best fitness rank N. Each time the algorithm passes through step 2c, a variable is chosen to be mutated according to a probability distribution over the k ranks, given by

P(k) ~ k^(-τ),  1 ≤ k ≤ N,  (1)

where τ is a positive adjustable parameter. For τ → 0 the algorithm becomes a random walk, while for τ → ∞ we have a deterministic search. The introduction of the parameter τ allows the algorithm to choose any variable to mutate, while privileging the ones with low fitness.
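The rank selection of Eq. (1) can be sketched as follows. This is an illustrative implementation under our own naming; it simply normalizes the weights k^(-τ) and samples a rank by inverse transform.

```python
import random

def choose_rank(n, tau, rng=random):
    """Pick a rank k in 1..n with probability proportional to k**(-tau).
    Rank 1 is the least-fit variable, so low ranks are favored for tau > 0."""
    weights = [k ** -tau for k in range(1, n + 1)]
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for k, w in enumerate(weights, start=1):
        acc += w
        if r <= acc:
            return k
    return n  # guard against floating-point round-off
```

The limiting behaviors described in the text follow directly: with tau = 0 all weights equal 1 and the choice is uniform (a random walk), while for very large tau the weight of rank 1 dominates and the search becomes essentially deterministic.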
This implementation of the EO method received the name τ-EO, and showed performance superior to the standard implementation, even in cases where the basic EO algorithm would become trapped in local minima. As pointed out by Boettcher and Percus, "a drawback of the EO method is that a general definition of fitness for the individual variables may prove ambiguous or even impossible". This means that for each new optimization problem assessed, a new way to assign fitness to the design variables may have to be created. Moreover, to our knowledge it has so far been applied only to combinatorial problems, with no implementation for continuous functions.

In order to make the EO method applicable to a broad class of design optimization problems, without concern for how the fitness of the design variables would be assigned, and capable of tackling continuous, discrete or integer variables alike, a generalization of the EO, called Generalized Extremal Optimization, was devised. In this new algorithm, fitness is assigned not directly to the design variables, but to a "population of species" that encodes the variables. Each species receives its fitness, and eventually mutates, following general rules. The GEO algorithm is described in the next Section.

THE GENERALIZED EXTREMAL OPTIMIZATION ALGORITHM
We devised the GEO algorithm using the same logic as the evolutionary model of Bak and Sneppen, but applying the τ-EO approach to choose the species that will mutate. Following Bak and Sneppen, L species are aligned, and to each species is assigned a fitness value that determines which species are more prone to mutate. We can think of these species as bits that can assume the value 0 or 1. Hence, the entire population consists of a single binary string. The design variables of the optimization problem are encoded in this string, which is similar to a chromosome in a canonical GA, but with each bit considered as a species or individual, as shown in Figure 1.
To each species (bit) is assigned a fitness number that is proportional to the gain (or loss) the objective function value has in mutating the bit.
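One GEO iteration, as just described, can be sketched as follows. This is our own minimal reading of the scheme, assuming minimization and assuming the convention that rank 1 goes to the least adapted bit, i.e. the bit whose flip most improves the objective; the function name and parameters are illustrative, not the authors' implementation.

```python
import random

def geo_step(bits, objective, tau, rng=random):
    """One GEO iteration (minimization): rank every bit by the change
    its flip would cause in the objective, then flip one bit chosen
    with probability proportional to rank**(-tau)."""
    v = objective(bits)
    # Bit fitness: change in objective value if that single bit is flipped.
    deltas = []
    for i in range(len(bits)):
        trial = bits[:]
        trial[i] ^= 1
        deltas.append(objective(trial) - v)
    # Assumed ranking: rank 1 = least adapted bit = most negative delta.
    order = sorted(range(len(bits)), key=deltas.__getitem__)
    weights = [(k + 1) ** -tau for k in range(len(bits))]
    chosen = rng.choices(order, weights=weights)[0]
    new = bits[:]
    new[chosen] ^= 1
    return new
```

For example, minimizing the number of ones in an 8-bit string with a moderate τ drives the string toward all zeros, while still occasionally flipping a "good" bit, which is what lets the method escape local optima.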
REFERENCES
[1] David W. Miller et al., "Assessing the Performance of a Heuristic Simulated Annealing Algorithm for the Design of Distributed Satellite Systems", 2001.
[2] K. Krishnakumar et al., "Applications of Evolutionary Algorithms to Aerospace Problems: A Survey", 1996.
[3] R. B. Patil et al., "SALO: Combining Simulated Annealing and Local Optimization for Efficient Global Optimization", 1996.
[4] Wallace T. Fowler et al., "Interplanetary Flyby Mission Optimization Using a Hybrid Global-Local Search Method", 2000.
[5] Stefan Boettcher et al., "Optimization with Extremal Dynamics", Complexity, 2003.
[6] William A. Crossley et al., "Aerodynamic and Aeroacoustic Optimization of Rotorcraft Airfoils via a Parallel Genetic Algorithm", 2000.
[7] Kenneth A. De Jong et al., "A Cooperative Coevolutionary Approach to Function Optimization", PPSN, 1994.
[8] D. Wolpert et al., "No Free Lunch Theorems for Search", 1995.
[9] Stephen D. Heister et al., "Application of a Genetic Algorithm to the Optimization of Hybrid Rockets", 2000.
[10] Domenico Quagliarella et al., "Airfoil and Wing Design Through Hybrid Optimization Strategies", 1998.
[11] C. D. Gelatt et al., "Optimization by Simulated Annealing", Science, 1983.
[12] M. Damodaran et al., "Aerodynamic Shape Optimization Using Computational Fluid Dynamics and Parallel Simulated Annealing Algorithms", 2001.
[13] Bak et al., "Punctuated Equilibrium and Criticality in a Simple Model of Evolution", Physical Review Letters, 1993.
[14] David E. Goldberg et al., "Genetic Algorithms in Search, Optimization and Machine Learning", 1988.