How an optimal observer can smooth a landscape

Most metaheuristics seek a balance between exploitation and exploration. Exploration efficiency depends strongly on the size and ruggedness of the search space, and a metaheuristic such as the simple genetic algorithm (SGA) is not well suited to traversing very large landscapes, especially deceptive ones. The approach proposed here improves the exploration of the SGA by adding a second search process that acts on the way solutions are coded. We designate as an "observer" each possible coding that aims to reduce the search space. Information on the quality and adequacy of a candidate observer is obtained by adopting its coding and testing how much the SGA benefits from the resulting search-space reduction. The approach investigated here trains the observers for a fixed time and then selects the one most suitable for solving the whole problem. Concretely, a second evolutionary stage is added to evolve observers for the SGA; these observers aim to reduce the size and smooth the ruggedness of the search space through a simplification of the genotype. To test the proposed approach, we apply it to the classical hierarchical if-and-only-if (HIFF) problem and measure the efficiency of our approach in terms of solution quality and computation time.
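The HIFF benchmark used as the test problem can be sketched as follows. This is a minimal illustration of the standard hierarchical if-and-only-if fitness function (the function name and the bit-list representation are our own illustrative choices, not taken from the paper): a string of length 2^k scores one point per bit, plus, at every hierarchical level, the size of each block whose bits are all equal.

```python
def hiff(bits):
    """Standard HIFF fitness of a bit string whose length is a power of two.

    Every single bit contributes 1; every uniform block (all 0s or all 1s)
    at every hierarchical level contributes its own length.
    """
    n = len(bits)
    if n == 1:
        return 1
    half = n // 2
    # Recurse into the two halves, then reward this block if it is uniform.
    score = hiff(bits[:half]) + hiff(bits[half:])
    if all(b == bits[0] for b in bits):
        score += n
    return score

# The two global optima are the all-0 and all-1 strings: for length 8 each
# scores 8*(log2(8) + 1) = 32, while an alternating string scores only 8.
print(hiff([0] * 8), hiff([1] * 8), hiff([0, 1] * 4))
```

The deceptiveness of HIFF comes from the fact that all-0 and all-1 blocks are equally rewarded at every level, so partial solutions built from incompatible blocks lead the SGA away from either global optimum.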
