Falsification and Future Performance

We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary, we show that empirical VC-entropy quantifies the message length of the true hypothesis in the optimal code of a particular probability distribution, the so-called actual repertoire.
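
To make the two capacity measures concrete, here is a minimal sketch (not taken from the paper) of how both can be computed for a finite hypothesis class on a fixed sample. The class is represented by a hypothetical prediction matrix preds, whose rows hold each classifier's {-1, +1} labels on the n sample points; the function names and the choice of base-2 logarithm are illustrative assumptions.

    import numpy as np

    def empirical_vc_entropy(preds):
        # Empirical VC-entropy: log (base 2 here, so measured in bits) of
        # the number of distinct labelings (dichotomies) that the class
        # realizes on the fixed sample.
        dichotomies = {tuple(row) for row in preds}
        return np.log2(len(dichotomies))

    def empirical_rademacher(preds, n_draws=1000, seed=0):
        # Monte Carlo estimate of empirical Rademacher complexity:
        # E_sigma[ max_f (1/n) sum_i sigma_i f(x_i) ] over the finite class,
        # i.e. how well the class can correlate with random sign noise.
        rng = np.random.default_rng(seed)
        m, n = preds.shape
        total = 0.0
        for _ in range(n_draws):
            sigma = rng.choice([-1.0, 1.0], size=n)  # random sign vector
            total += np.max(preds @ sigma) / n       # best classifier for this noise
        return total / n_draws

    # Example: three classifiers evaluated on four sample points.
    preds = np.array([[1, 1, -1, -1],
                      [1, -1, 1, -1],
                      [1, 1, 1, 1]])
    print(empirical_vc_entropy(preds))   # log2(3): three distinct dichotomies
    print(empirical_rademacher(preds))

On the paper's reading, each realizable dichotomy is a candidate hypothesis about the sample's labels, and these are what the reformulated capacity measures count when the learner falsifies all but the empirical-risk minimizer.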
