Advancing continuous IDEAs with mixture distributions and factorization selection metrics

For continuous representations, evolutionary optimization based on probabilistic models has so far been limited to the use of factorized probability distributions. Moreover, constructing such factorizations previously required a maximum-complexity parameter K to prevent unnecessary complexity from being introduced into the factorization. In this paper, we advance these techniques by using clustering and the EM algorithm to allow for mixture distributions. Furthermore, we apply a factorization selection metric to eliminate the need for the K parameter. We use these techniques within the IDEA framework to obtain new continuous evolutionary optimization algorithms and investigate their performance.

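As a minimal sketch (not the authors' implementation), the following illustrates the kind of mixture-based estimation step the abstract describes: an IDEA-style selection/estimation/sampling loop in which a diagonal-covariance Gaussian mixture is fitted to the selected solutions with EM and new solutions are sampled from it. The sphere objective, population size, selection fraction tau, number of mixture components k, and convergence settings are illustrative assumptions, not values from the paper.

```python
# Sketch of an IDEA-style loop with an EM-fitted Gaussian mixture as the
# density estimator. All problem/parameter choices here are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Toy objective to minimize (assumption, not from the paper).
    return np.sum(x**2, axis=-1)

def em_gmm(X, k, iters=50, eps=1e-6):
    """Fit a diagonal-covariance Gaussian mixture to X with EM."""
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]          # means on random samples
    var = np.full((k, d), X.var(axis=0) + eps)       # shared initial variances
    w = np.full(k, 1.0 / k)                          # uniform mixing weights
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | sample i)
        logp = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                        + np.log(2 * np.pi * var)).sum(axis=2)
                + np.log(w))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0) + eps
        w = nk / n
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ (X ** 2)) / nk[:, None] - mu ** 2 + eps
    return w, mu, var

def sample_gmm(w, mu, var, n):
    # Draw a component per sample, then sample from its Gaussian.
    comp = rng.choice(len(w), size=n, p=w)
    return rng.normal(mu[comp], np.sqrt(var[comp]))

# IDEA-style iteration: select the best fraction tau, estimate, resample.
d, pop_size, tau, k = 5, 200, 0.3, 3
pop = rng.uniform(-5, 5, size=(pop_size, d))
for gen in range(30):
    fitness = sphere(pop)
    sel = pop[np.argsort(fitness)[:int(tau * pop_size)]]
    w, mu, var = em_gmm(sel, k)
    offspring = sample_gmm(w, mu, var, pop_size - len(sel))
    pop = np.vstack([sel, offspring])   # keep selected solutions, replace rest
print("best fitness:", sphere(pop).min())
```

A mixture model like this can capture multiple basins of attraction that a single factorized Gaussian would blur together, which is the motivation for combining clustering/EM with the IDEA framework.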