Kolmogorov Complexity, Optimization and Hardness

The Kolmogorov complexity (KC) of a string is the length of the shortest program that prints the string and then halts. In optimization, this measure is often used as an indicator of expected function difficulty, yet known counterexamples exist. This paper investigates the applicability of KC as an estimator of problem difficulty for optimization in the black-box scenario. In particular, we address the known counterexamples (e.g., pseudorandom functions and the needle-in-a-haystack, NIAH) and explore the connection between KC and the No Free Lunch theorems (NFLTs). We conclude that high KC implies hardness: every easy fitness function has low KC. The converse, however, does not hold; a function with low KC is not necessarily easy to optimize.
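The needle-in-a-haystack counterexample mentioned above can be made concrete. A minimal sketch, assuming bit strings of length n and a hypothetical fixed needle: the fitness function below has a very short description (hence low KC), yet it provides no gradient information, so any black-box optimizer needs on the order of 2**n evaluations in expectation.

```python
import random

def niah(x, needle):
    """Needle-in-a-haystack fitness: 1 on the needle, 0 elsewhere.
    The function has a short description (low Kolmogorov complexity),
    but its fitness values give a black-box searcher no guidance."""
    return 1 if x == needle else 0

def random_search(f, n, needle, max_queries):
    """Sample uniform random bit strings until the needle is found;
    return the number of queries used, or None on failure."""
    for q in range(1, max_queries + 1):
        x = tuple(random.randint(0, 1) for _ in range(n))
        if f(x, needle) == 1:
            return q
    return None

random.seed(0)               # fixed seed so the demo is reproducible
n = 8
needle = tuple([1] * n)      # hypothetical needle chosen for the demo
queries = random_search(niah, n, needle, max_queries=10_000)
# On average about 2**n = 256 queries are needed for n = 8; the cost
# grows exponentially in n despite the constant-size description.
```

This is the sense in which low KC fails to imply easiness: shortening the description of the function does nothing to shrink the search space seen by a black-box algorithm.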
