Analyzing the Components of Distributed Coevolutionary GAN Training

Distributed coevolutionary Generative Adversarial Network (GAN) training has empirically shown success in overcoming GAN training pathologies. This is mainly due to diversity maintenance in the populations of generators and discriminators during the training process. The method studied here coevolves sub-populations on each cell of a spatial grid organized into overlapping Moore neighborhoods. We investigate how two algorithm components that influence diversity during coevolution affect performance: performance-based selection/replacement inside each sub-population, and communication through migration of solutions (networks) among overlapping neighborhoods. In experiments on the MNIST dataset, we find that the combination of these two components provides the best generative models. In addition, migrating solutions without applying selection in the sub-populations achieves competitive results, while selection without communication between cells reduces performance.
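The two components studied above can be sketched in isolation. Below is a minimal, hypothetical illustration (not the paper's implementation) of a toroidal spatial grid with Moore neighborhoods, where `migrate` implements communication by copying each neighbor's best solutions into a cell's sub-population, and `select` implements performance-based truncation selection/replacement. Solutions are stood in for by dictionaries carrying a `loss` value; all names and parameters here are illustrative assumptions.

```python
GRID = 3  # assumed 3x3 toroidal grid, one sub-population per cell


def moore_neighbors(row, col, size=GRID):
    """Coordinates of the Moore neighborhood (the cell plus its 8
    surrounding cells), with toroidal wrap-around at the grid edges."""
    return [((row + dr) % size, (col + dc) % size)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)]


def migrate(grid, row, col, k=1):
    """Communication component: copy the k best (lowest-loss) solutions
    from each neighboring cell into this cell's sub-population."""
    for (r, c) in moore_neighbors(row, col):
        if (r, c) == (row, col):
            continue  # do not re-import the cell's own solutions
        best = sorted(grid[(r, c)], key=lambda s: s["loss"])[:k]
        grid[(row, col)].extend(best)


def select(subpop, mu):
    """Selection/replacement component: truncation selection keeping
    only the mu best solutions in the sub-population."""
    return sorted(subpop, key=lambda s: s["loss"])[:mu]
```

Because neighborhoods overlap, a strong solution imported into one cell can later propagate to that cell's own neighbors, which is how diversity and good solutions spread across the grid; the paper's finding is that this migration step helps even when `select` is disabled, whereas `select` alone (no migration) hurts.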
