Hierarchically-Consistent GA Test Problems

The Building-Block Hypothesis suggests that GAs perform well when they are able to identify above-average-fitness, low-order schemata and recombine them to produce higher-order schemata of higher fitness. We suppose that this recombinative process continues recursively, combining schemata of successively higher order as the search progresses. Historically, attempts to illustrate this intuitively straightforward process on abstract test problems, most notably the Royal Road problems, have been somewhat perplexing, and more recent building-block test problems have abandoned the multi-level hierarchical structure of the Royal Roads, and thus departed from the original recursive aspects of the hypothesis. This paper defines the concept of hierarchical consistency, which captures the recursive nature of problems implied by the Building-Block Hypothesis. Hierarchical consistency causes us to rethink some established concepts of problem difficulty, for example, order-k delineation and deception as defined with respect to a single global optimum. We introduce several variants of problems that are hierarchically consistent, and discuss these concepts of problem difficulty with respect to the new models. Experimental results begin to explore the effects that these variations have on GA performance.
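
To make the recursive structure concrete, below is a minimal sketch of a hierarchically-consistent fitness function in the spirit of Hierarchical If-and-only-If (H-IFF). The function name, block decomposition, and scoring scheme are illustrative assumptions, not the definitions introduced in this paper; the sketch only shows how fitness contributions can be awarded consistently at every level of a recursive hierarchy.

```python
# Sketch of a hierarchically-consistent test function (H-IFF-style).
# Assumes bit strings whose length is a power of two; names are illustrative.

def hiff(block):
    """Recursive fitness of a bit string.

    A block contributes its own length when all of its bits agree
    (a 'solved' building block), and the same rule is applied at every
    level of the hierarchy, so partial solutions are rewarded at all scales.
    """
    n = len(block)
    if n == 1:
        return 1
    mid = n // 2
    left, right = block[:mid], block[mid:]
    bonus = n if all(b == block[0] for b in block) else 0
    return bonus + hiff(left) + hiff(right)

# Two internally consistent halves that disagree with each other still score
# well, but less than a string that is consistent at the top level too.
print(hiff([1, 1, 1, 1, 0, 0, 0, 0]))  # 24
print(hiff([1] * 8))                    # 32
```

Note that, unlike an order-k decomposable function, the reward for agreement does not stop at some fixed block size: every level of the hierarchy, up to the whole string, contributes, which is the sense in which such a function is hierarchically consistent.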
