A framework for optimization using approximate functions

Population-based, stochastic, zero-order optimization methods (e.g. genetic and evolutionary algorithms) are a popular choice for solving intractable, real-world optimization problems. These methods are attractive because they are easy to use and, unlike their gradient-based counterparts, require no assumptions about continuity of the function or its derivatives. Despite these advantages, they require the evaluation of numerous candidate solutions, which is often computationally expensive and practically prohibitive. We introduce a framework for optimization using approximate functions. The optimization algorithm is a population-based, stochastic, zero-order, elite-preserving algorithm that uses approximate function evaluations in lieu of actual function evaluations. The approximate function is constructed using a radial basis function (RBF) network, and the network is periodically retrained every few generations, unlike other approaches that build a single approximate model and reuse it without retraining. A scheme for controlled elitism is incorporated within the optimization framework to ensure convergence in the actual function space. The computational accuracy and efficiency of the proposed optimization framework are assessed on a set of five mathematical test functions. The results clearly indicate that the optimization framework using approximations arrives at reasonably accurate results using only a fraction of the actual function evaluations.
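To make the workflow concrete, the sketch below illustrates the general idea described in the abstract: a population-based optimizer that ranks most candidates on an RBF surrogate, retrains the surrogate every few generations on true evaluations, and preserves elites whose fitness is confirmed on the actual function. This is a minimal, hypothetical illustration, not the authors' algorithm; the objective, operators (simple Gaussian mutation rather than the paper's real-coded crossover), and all parameter names are assumptions introduced here for clarity.

```python
# Hypothetical sketch of surrogate-assisted, elite-preserving optimization
# with periodic retraining of a Gaussian RBF network. Illustrative only.
import numpy as np


def true_function(x):
    # Expensive objective (here a cheap stand-in: the sphere function).
    return float(np.sum(x ** 2))


def fit_rbf(centers, values, width=1.0):
    # Fit a Gaussian RBF network by solving the linear system for its weights.
    d2 = np.sum((centers[:, None, :] - centers[None, :, :]) ** 2, axis=-1)
    phi = np.exp(-d2 / (2.0 * width ** 2))
    weights = np.linalg.solve(phi + 1e-8 * np.eye(len(centers)), values)
    return centers, weights, width


def predict_rbf(model, x):
    # Approximate function value at x from the trained RBF network.
    centers, weights, width = model
    d2 = np.sum((centers - x) ** 2, axis=-1)
    return float(np.exp(-d2 / (2.0 * width ** 2)) @ weights)


def optimize(dim=5, pop_size=20, generations=50, retrain_every=5,
             n_elites=2, sigma=0.3, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    model = fit_rbf(pop, np.array([true_function(x) for x in pop]))

    for gen in range(generations):
        # Periodically retrain the surrogate on true evaluations of the whole
        # population; in between, reuse the current model.
        if gen % retrain_every == 0:
            fitness = np.array([true_function(x) for x in pop])
            model = fit_rbf(pop, fitness)
        else:
            fitness = np.array([predict_rbf(model, x) for x in pop])

        # Elite preservation in the actual function space: the apparently best
        # individuals are confirmed with true evaluations before being kept.
        order = np.argsort(fitness)
        elites = pop[order[:n_elites]]
        elites = elites[np.argsort([true_function(x) for x in elites])]

        # Offspring by Gaussian mutation of the better half of the population,
        # ranked on the surrogate to avoid further expensive evaluations.
        parents = pop[order[: pop_size // 2]]
        offspring = parents[rng.integers(0, len(parents), pop_size - n_elites)]
        offspring = offspring + rng.normal(0.0, sigma, size=offspring.shape)

        pop = np.vstack([elites, offspring])

    best = min(pop, key=true_function)
    return best, true_function(best)


if __name__ == "__main__":
    x_best, f_best = optimize()
    print("best point:", np.round(x_best, 3), "true value:", round(f_best, 4))
```

Under these assumed settings, true evaluations are needed only for the full population at every retraining step and for the handful of elites each generation, which is how a surrogate of this kind can cut the number of expensive evaluations to a fraction of a conventional run.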
