Evolutionary Continuous Optimization by Distribution Estimation with Variational Bayesian Independent Component Analyzers Mixture Model

In evolutionary continuous optimization by building and using probabilistic models, the multivariate Gaussian distribution and its variants or extensions, such as the mixture of Gaussians, have been widely used. However, the Gaussian assumption is often violated in many real problems. In this paper, we propose new continuous estimation of distribution algorithms (EDAs) based on the variational Bayesian independent component analyzers mixture model (vbICA-MM), which allows arbitrary distributions to be modeled. We examine how this sophisticated density estimation technique influences optimization performance by employing the same selection and population alternation schemes used in previous EDAs. Our experimental results show that the presented EDAs achieve better performance than previous EDAs based on ICA and on Gaussian mixture- or kernel-based approaches.
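To make the overall scheme concrete, the following is a minimal sketch of a continuous EDA loop with truncation selection: fit a density model to the selected individuals, then sample the next population from it. The vbICA-MM itself is not implemented here; a Gaussian mixture from scikit-learn stands in as the density model, and the `sphere` objective, population sizes, and selection scheme are illustrative assumptions rather than the authors' exact setup.

```python
# Minimal continuous-EDA sketch. The density model here (GaussianMixture) is a
# stand-in for the vbICA-MM; swap in any estimator exposing fit() and sample().
import numpy as np
from sklearn.mixture import GaussianMixture


def sphere(x):
    # Simple benchmark objective (minimization), evaluated row-wise.
    return np.sum(x ** 2, axis=1)


def eda(objective, dim=10, pop_size=200, selected=100,
        n_components=3, generations=50, seed=0):
    rng = np.random.default_rng(seed)
    # Initial population sampled uniformly in [-5, 5]^dim (assumed bounds).
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
    for _ in range(generations):
        fitness = objective(pop)
        # Truncation selection: keep the best `selected` individuals.
        parents = pop[np.argsort(fitness)[:selected]]
        # Estimate the distribution of the selected individuals.
        model = GaussianMixture(n_components=n_components,
                                covariance_type="full",
                                random_state=seed).fit(parents)
        # Sample a new population from the learned model; keep the best parent
        # (simple elitism) so the incumbent solution is never lost.
        offspring, _ = model.sample(pop_size - 1)
        pop = np.vstack([parents[:1], offspring])
    best = pop[np.argmin(objective(pop))]
    return best, objective(best[None, :])[0]


if __name__ == "__main__":
    best_x, best_f = eda(sphere)
    print("best fitness:", best_f)
```

Because the density model is the only component that changes between the compared EDAs, a fit/sample interface like the one above is enough to plug in Gaussian kernel, ICA, or vbICA-MM estimators while keeping selection and population alternation fixed.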
