The Evolutionary Buffet Method

Within the fields of Genetic Algorithms (GA) and Artificial Intelligence (AI), a variety of computational substrates capable of solving a wide range of problems have been described. Research has specialized in computational substrates that each excel in different problem domains. For example, Artificial Neural Networks (ANN) (Russell et al., Artificial intelligence: a modern approach, vol 2. Prentice Hall, Upper Saddle River, 2003) have proven effective at classification; Genetic Programs (by which we mean mathematical tree-based genetic programming, abbreviated GP) (Koza, Stat Comput 4:87–112, 1994) are often used to find complex equations that fit data; NeuroEvolution of Augmenting Topologies (NEAT) (Stanley and Miikkulainen, Evolut Comput 10:99–127, 2002) is good at robotics control problems (Cully et al., Nature 521:503, 2015); and Markov Brains (MB) (Edlund et al., PLoS Comput Biol 7:e1002236, 2011; Marstaller et al., Neural Comput 25:2079–2107, 2013; Hintze et al., Markov brains: a technical introduction. arXiv:1709.05601, 2017) are used to test hypotheses about evolutionary behavior (Olson et al., J R Soc Interf 10:20130305, 2013), among many other examples. Given the wide range of problems and the vast number of computational substrates, practitioners of GA and AI face the difficulty that every new problem requires an assessment to find an appropriate computational substrate, as well as specific parameter tuning to achieve optimal results.
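The separation described above — a generic evolutionary search loop on one side, and substrate-specific representations, operators, and parameters on the other — can be illustrated with a minimal sketch. This is not the buffet method itself, only a hypothetical pluggable-substrate loop: the function names (`evolve`, `init`, `mutate`, `fitness`) and all parameter values are illustrative assumptions, shown here with a trivial bitstring substrate on a one-max problem.

```python
import random

def evolve(init, mutate, fitness, pop_size=50, generations=100, seed=0):
    """Minimal truncation-selection GA loop with pluggable substrate operators.

    `init`, `mutate`, and `fitness` together define the computational
    substrate: in principle the same loop could drive ANN weight vectors,
    GP trees, or Markov Brain genomes, but each substrate would need its
    own operators and its own tuning of pop_size, mutation rate, etc.
    """
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the best half unchanged, refill with mutated copies of elites.
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        pop = elite + [mutate(rng.choice(elite), rng)
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

# Example substrate: fixed-length bitstrings scored on one-max (count of 1s).
best = evolve(
    init=lambda rng: [rng.randint(0, 1) for _ in range(20)],
    mutate=lambda g, rng: [b ^ (rng.random() < 0.05) for b in g],
    fitness=sum,
)
print(sum(best))
```

Swapping the three lambdas for, say, a tree representation with subtree mutation would change which problems this loop solves well — which is exactly the per-problem assessment burden the paragraph describes.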

[1] C. Titus Brown et al., Evolutionary Learning in the 2D Artificial Life System "Avida", 1994, adap-org/9405003.

[2] Kenneth O. Stanley et al., A Hypercube-Based Encoding for Evolving Large-Scale Neural Networks, 2009, Artificial Life.

[3] William F. Punch et al., Parameter-less Population Pyramid, 2014, GECCO.

[4] Sebastian Risi et al., HyperNTM: Evolving Scalable Neural Turing Machines through HyperNEAT, 2017, arXiv.

[5] Arend Hintze et al., Evolution of Integrated Causal Structures in Animats Exposed to Environments of Increasing Complexity, 2014, PLoS Comput. Biol.

[6] Geoffrey E. Hinton et al., Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer, 2017, ICLR.

[7] Peter Norvig et al., Artificial Intelligence: A Modern Approach, 1995.

[8] Antoine Cully et al., Robots That Can Adapt like Animals, 2014, Nature.

[9] Arend Hintze et al., Evolving Autonomous Learning in Cognitive Networks, 2017, Scientific Reports.

[10] Julian Francis Miller, Cartesian Genetic Programming, 2011, Cartesian Genetic Programming.

[11] Charles Ofria et al., Early Evolution of Memory Usage in Digital Organisms, 2010, ALIFE.

[12] Michael I. Jordan, Serial Order: A Parallel Distributed Processing Approach, 1997.

[13] D. Wolpert et al., No Free Lunch Theorems for Search, 1995.

[14] Kevin Leyton-Brown et al., Auto-WEKA: Combined Selection and Hyperparameter Optimization of Classification Algorithms, 2012, KDD.

[15] Arend Hintze et al., Information-Theoretic Neuro-Correlates Boost Evolution of Cognitive Systems, 2015, Entropy.

[16] David H. Wolpert et al., Coevolutionary Free Lunches, 2005, IEEE Transactions on Evolutionary Computation.

[17] Arend Hintze et al., Evolution of Autonomous Hierarchy Formation and Maintenance, 2014, ALIFE.

[18] Arend Hintze et al., The Evolution of Representation in Simple Cognitive Networks, 2012, Neural Computation.

[19] David H. Wolpert et al., The Lack of A Priori Distinctions Between Learning Algorithms, 1996, Neural Computation.

[20] Arend Hintze et al., Evolutionary Game Theory Using Agent-Based Methods, 2014, Physics of Life Reviews.

[21] Jeffrey L. Elman, Finding Structure in Time, 1990, Cognitive Science.

[22] Leslie Pack Kaelbling et al., Planning and Acting in Partially Observable Stochastic Domains, 1998, Artificial Intelligence.

[23] Luis Muñoz et al., NEAT, There's No Bloat, 2014, EuroGP.

[24] John R. Koza, Genetic Programming as a Means for Programming Computers by Natural Selection, 1994.

[25] David H. Wolpert et al., No Free Lunch Theorems for Optimization, 1997, IEEE Transactions on Evolutionary Computation.

[26] Arend Hintze et al., Integrated Information Increases with Fitness in the Evolution of Animats, 2011, PLoS Comput. Biol.

[27] Richard S. Sutton et al., Neuronlike Adaptive Elements That Can Solve Difficult Learning Control Problems, 1983, IEEE Transactions on Systems, Man, and Cybernetics.

[28] Arend Hintze et al., MABE (Modular Agent Based Evolver): A Framework for Digital Evolution Research, 2017, ECAL.

[29] Geoffrey E. Hinton et al., Adaptive Mixtures of Local Experts, 1991, Neural Computation.

[30] Cullen Schaffer, A Conservation Law for Generalization Performance, 1994, ICML.

[31] Arend Hintze et al., Predator Confusion Is Sufficient to Evolve Swarming Behaviour, 2012, Journal of the Royal Society Interface.

[32] Kenneth O. Stanley et al., Exploiting Open-Endedness to Solve Problems Through the Search for Novelty, 2008, ALIFE.

[33] Risto Miikkulainen et al., Evolving Neural Networks through Augmenting Topologies, 2002, Evolutionary Computation.