A No-Free-Lunch theorem for non-uniform distributions of target functions

The sharpened No-Free-Lunch theorem (NFL theorem) states that, regardless of the performance measure, the performance of all optimization algorithms averaged uniformly over any finite set F of functions is equal if and only if F is closed under permutation (c.u.p.). In this paper, we first summarize some recently proven consequences of this theorem: the number of subsets that are c.u.p. is negligible compared to the total number of possible subsets; in particular, problem classes relevant in practice are unlikely to be c.u.p. Furthermore, in certain scenarios the average number of evaluations needed to find a desirable (e.g., optimal) solution can be calculated independently of the optimization algorithm. Second, as the main result, the NFL theorem is extended: necessary and sufficient conditions for NFL results to hold are given for arbitrary distributions of target functions. This yields the most general NFL theorem for optimization presented so far.
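The "if and only if" condition can be illustrated on a toy example (a hypothetical sketch, not taken from the paper): on a three-point search space, two different deterministic, non-revisiting search orders need the same average number of evaluations to first hit the optimum when averaging over a set of functions closed under permutation, but generally not when averaging over a set that is not c.u.p.

```python
from itertools import permutations

# Toy search space X = {0, 1, 2}; a function is a tuple (f[0], f[1], f[2]).

def orbit(f):
    """Smallest set containing f that is closed under permutation (c.u.p.)."""
    return {tuple(f[sigma[x]] for x in range(len(f)))
            for sigma in permutations(range(len(f)))}

def evals_to_optimum(f, order):
    """Evaluations a fixed, non-revisiting search order needs to first see max(f)."""
    best = max(f)
    for n, x in enumerate(order, start=1):
        if f[x] == best:
            return n

def avg_evals(F, order):
    """Average number of evaluations over the function set F."""
    return sum(evals_to_optimum(f, order) for f in F) / len(F)

alg_a = (0, 1, 2)   # two different deterministic search orders
alg_b = (2, 0, 1)

F_cup = orbit((0, 0, 1))   # c.u.p. set: {(1,0,0), (0,1,0), (0,0,1)}
print(avg_evals(F_cup, alg_a), avg_evals(F_cup, alg_b))   # 2.0 2.0 -- equal

F_not = {(0, 0, 1)}        # not c.u.p.: averages differ
print(avg_evals(F_not, alg_a), avg_evals(F_not, alg_b))   # 3.0 1.0
```

Any non-c.u.p. set behaves like `F_not` for some pair of algorithms, which is the content of the "only if" direction.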
