Sharpening Occam's razor

We provide a new representation-independent formulation of Occam's razor theorem, based on Kolmogorov complexity. This new formulation allows us to:

- Obtain better sample complexity than both the length-based [4] and VC-based [3] versions of Occam's razor theorem in many applications (the classical length-based bound is sketched below for reference).
- Achieve a sharper reverse of Occam's razor theorem than that of [5]. Specifically, we weaken the assumptions made in [5] and extend the reverse to superpolynomial running times.
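For orientation, the following is a hedged sketch of the sample-complexity guarantee usually stated for the length-based Occam's razor theorem, which is the kind of bound the first claim above is compared against. The sketch assumes the standard setting of an Occam algorithm that, given m examples of a target concept of size n, outputs a consistent hypothesis of size at most n^α m^β with α ≥ 1 and 0 ≤ β < 1; the exact constants and the paper's own Kolmogorov-complexity formulation are given in the paper body, not here.

```latex
% Commonly stated sample-complexity bound for a length-based Occam algorithm
% (consistent hypothesis of size at most n^alpha * m^beta, alpha >= 1, 0 <= beta < 1).
% epsilon and delta are the usual PAC accuracy and confidence parameters.
% The paper's representation-independent version is stated in terms of the
% Kolmogorov complexity of the hypothesis rather than its representation length.
\[
  m \;=\; O\!\left(
      \frac{1}{\epsilon}\,\ln\frac{1}{\delta}
      \;+\;
      \left(\frac{n^{\alpha}}{\epsilon}\right)^{\!\frac{1}{1-\beta}}
  \right)
\]
```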

[1] Robert E. Schapire, et al. The Boosting Approach to Machine Learning: An Overview, 2003.

[2] David Haussler, et al. Predicting {0,1}-functions on randomly drawn points, 1988, COLT '88.

[3] Temple F. Smith. Occam's razor, 1980, Nature.

[4] Manfred K. Warmuth. Towards Representation Independence in PAC Learning, 1989, AII.

[5] William I. Gasarch, et al. Book Review: An Introduction to Kolmogorov Complexity and Its Applications, Second Edition, 1997, by Ming Li and Paul Vitanyi (Springer Graduate Text Series), 1997, SIGACT News.

[6] Leonard Pitt, et al. On the necessity of Occam algorithms, 1990, STOC '90.

[7] Leslie G. Valiant, et al. A theory of the learnable, 1984, STOC '84.

[8] Ming Li, et al. An Introduction to Kolmogorov Complexity and Its Applications, 2019, Texts in Computer Science.

[9] David Haussler, et al. Quantifying Inductive Bias: AI Learning Algorithms and Valiant's Learning Framework, 1988, Artif. Intell.

[10] Manfred K. Warmuth, et al. On Weak Learning, 1995, J. Comput. Syst. Sci.

[11] Tao Jiang, et al. Linear approximation of shortest superstrings, 1994, JACM.

[12] Ming Li, et al. Towards a DNA sequencing theory (learning a string), 1990, Proceedings of the 31st Annual Symposium on Foundations of Computer Science.

[13] David Haussler, et al. Learnability and the Vapnik-Chervonenkis dimension, 1989, JACM.

[14] Martin Anthony, et al. Computational learning theory: an introduction, 1992.

[15] Leslie G. Valiant, et al. A general lower bound on the number of examples needed for learning, 1988, COLT '88.