Resource-bounded Dimension in Computational Learning Theory

This paper studies the relation between computational learning theory and resource-bounded dimension. We establish close connections between the learnability or nonlearnability of a concept class and its size in terms of effective dimension, which allows powerful dimension techniques to be used in computational learning and, vice versa, learning results to be imported into complexity theory via dimension. First, we obtain a tight result on the dimension of classes that are learnable in the online mistake-bound model. Second, concerning PAC learning, we show that the polynomial-space dimension of PAC learnable concept classes is zero; this yields a hypothesis on effective dimension that implies inherent unpredictability, since any concept class of nonzero polynomial-space dimension is not efficiently PAC learnable using any hypothesis class. Third, we prove that the polynomial-space dimension of concept classes learnable by a membership-query algorithm is likewise zero.
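For context, the dimension notion used throughout is Lutz's resource-bounded dimension, which is standardly characterized via gales. The following is a brief sketch of that characterization in our own notation (the symbols d, s, Delta, and X are introduced here for illustration):

```latex
% Gale characterization of resource-bounded dimension (Lutz).
% For s >= 0, an s-gale is a function d : {0,1}^* -> [0,\infty)
% satisfying the fairness condition
\[
  d(w) \;=\; 2^{-s}\bigl[\,d(w0) + d(w1)\,\bigr] \quad \text{for all strings } w .
\]
% d succeeds on an infinite sequence S if
% \limsup_{n\to\infty} d(S \upharpoonright n) = \infty,
% and the \Delta-dimension of a class X of sequences is
\[
  \dim_{\Delta}(X) \;=\; \inf \{\, s \mid \text{some } \Delta\text{-computable } s\text{-gale succeeds on every } S \in X \,\} .
\]
% Polynomial-space dimension is the case where the gale is computable
% in polynomial space.
```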

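To make the online mistake-bound model concrete, here is a minimal, self-contained sketch of the classic halving algorithm, whose mistake bound is log2 of the class size. This is a textbook illustration of the learning model, not the paper's construction; the function names and the point-function example are hypothetical.

```python
# Sketch of learning in the online mistake-bound model via the classic
# halving algorithm. Mistake bound: at most log2(|C|) mistakes, since
# every mistake eliminates at least half of the surviving hypotheses.
from math import log2


def halving_learner(concept_class, stream):
    """Run the halving algorithm over a labeled example stream.

    concept_class: finite list of hypotheses, each a function x -> 0/1,
                   assumed to contain the target concept
    stream:        iterable of (x, true_label) pairs
    Returns the number of prediction mistakes made.
    """
    version_space = list(concept_class)
    mistakes = 0
    for x, label in stream:
        # Predict by majority vote over the surviving hypotheses.
        ones = sum(h(x) for h in version_space)
        prediction = 1 if 2 * ones >= len(version_space) else 0
        if prediction != label:
            mistakes += 1  # the majority was wrong: >= half get removed
        # The true label is revealed each round; discard every
        # hypothesis that contradicts it.
        version_space = [h for h in version_space if h(x) == label]
    return mistakes


if __name__ == "__main__":
    # Hypothetical concept class: point functions over {0,...,7},
    # h_i(x) = 1 iff x == i; the target is h_5.
    concepts = [lambda x, i=i: int(x == i) for i in range(8)]
    target = concepts[5]
    stream = [(x, target(x)) for x in (3, 5, 1, 5, 7)]
    print(halving_learner(concepts, stream), "<=", int(log2(len(concepts))))
```

It is this kind of per-class mistake bound that the paper's first result relates to effective dimension.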