Environment identification-based memory scheme for estimation of distribution algorithms in dynamic environments

In estimation of distribution algorithms (EDAs), the joint probability distribution of high-performance solutions is represented by a probability model, so the model itself characterizes the promising regions of the solution space. From this point of view, an environment identification-based memory management scheme (EI-MMS) is proposed to adapt binary-coded EDAs to dynamic optimization problems (DOPs). Within this scheme, the probability models that characterize the search space of each encountered environment are stored and later retrieved to adapt the EDA when the environment changes. A diversity loss correction scheme and a boundary correction scheme are combined to counteract diversity loss during the static evolutionary process within each environment. Experimental results confirm the validity of the EI-MMS and indicate that it can be applied to any binary-coded EDA. Compared with three state-of-the-art algorithms, the univariate marginal distribution algorithm (UMDA) using the EI-MMS performs better on three decomposable DOPs. To provide a deeper understanding of the EI-MMS, a sensitivity analysis of its parameters is also carried out in this paper.
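
To make the memory scheme concrete, the following minimal Python sketch shows how an EI-MMS-style memory could wrap a binary-coded UMDA: the probability vector learned in each environment is stored, and after a change the stored model whose samples score best under the new fitness function is retrieved as the starting model. The function names, the toy OneMax-style environments, the specific correction formula, and all parameter values are illustrative assumptions, not the paper's exact procedure.

```python
# Illustrative sketch only: an environment identification-based memory around a
# univariate marginal distribution algorithm (UMDA). Details are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sample(p, pop_size):
    # Sample a binary population from the univariate probability vector p.
    return (rng.random((pop_size, p.size)) < p).astype(int)

def umda_update(pop, fitness, select_ratio=0.5):
    # Re-estimate the probability vector from the best individuals (truncation selection).
    k = max(1, int(select_ratio * len(pop)))
    best = pop[np.argsort(fitness)[-k:]]
    return best.mean(axis=0)

def correct_model(p, delta=0.05, shift=0.1):
    # Nudge the model toward 0.5 and clamp each probability to [delta, 1 - delta]:
    # an illustrative stand-in for the diversity-loss and boundary corrections.
    p = p + shift * (0.5 - p)
    return np.clip(p, delta, 1.0 - delta)

def identify_environment(memory, fitness_fn, probe_size=20):
    # Environment identification: choose the stored model whose samples
    # score best under the current fitness function.
    scores = [fitness_fn(sample(m, probe_size)).mean() for m in memory]
    return int(np.argmax(scores))

def run_one_environment(p, fitness_fn, generations=50, pop_size=100):
    # One static period: plain UMDA with the correction applied each generation.
    for _ in range(generations):
        pop = sample(p, pop_size)
        p = correct_model(umda_update(pop, fitness_fn(pop)))
    return p

# Toy usage: two alternating OneMax-like environments (all-ones vs. all-zeros target).
n = 30
targets = [np.ones(n, dtype=int), np.zeros(n, dtype=int)]
memory, p = [], np.full(n, 0.5)
for epoch in range(4):
    target = targets[epoch % 2]
    fitness_fn = lambda pop, t=target: (pop == t).sum(axis=1)
    if memory:  # an environmental change occurred; retrieve the best-matching stored model
        p = memory[identify_environment(memory, fitness_fn)].copy()
    p = run_one_environment(p, fitness_fn)
    memory.append(p.copy())  # store the model learned in this environment
    print(f"epoch {epoch}: best fitness in final sample = {fitness_fn(sample(p, 50)).max()}")
```

In this sketch the memory simply grows by one model per environment; a bounded memory with a replacement strategy, as used in memory-based evolutionary algorithms, would be the natural refinement.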
