On the Relationship between Models for Learning in Helpful Environments

The PAC model and its equivalent variants are widely accepted models for the polynomial learnability of concept classes. However, negative results abound in the PAC framework: concept classes such as deterministic finite state automata (DFA) are not efficiently learnable in the PAC model, and the model's requirement of learnability under all conceivable distributions can be considered too stringent for practical applications. Several models for learning in more helpful environments have been proposed in the literature, including learning from example-based queries [2], online learning with a bounded number of mistakes [14], learning with the help of teaching sets [7], learning from characteristic sets [5], and learning from simple examples [12,4]. Several concept classes that are not learnable in the standard PAC model have been shown to be learnable in these models. In this paper we identify the relationships between these different learning models. We also address the issue of unnatural collusion between the teacher and the learner, which can potentially trivialize the task of learning in helpful environments.
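To make the mistake-bound setting concrete, the sketch below gives a minimal version of Littlestone's Winnow algorithm [14] learning a monotone disjunction online. It is an illustrative sketch only: the function names, parameters, and the example target concept are assumptions for demonstration, not material from the paper.

```python
# Minimal Winnow-style online learner (Littlestone [14]), illustrative sketch.
# Assumes the target is a monotone disjunction over n Boolean attributes.
import itertools


def winnow_learn(examples, n, threshold=None, alpha=2.0):
    """Run Winnow over a stream of (x, label) pairs, where x is a 0/1 tuple of length n.

    Returns the final weight vector and the number of mistakes made.
    """
    theta = threshold if threshold is not None else n / 2.0
    w = [1.0] * n                      # start with uniform weights
    mistakes = 0
    for x, label in examples:
        prediction = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= theta else 0
        if prediction != label:
            mistakes += 1
            if label == 1:             # false negative: promote active attributes
                w = [wi * alpha if xi else wi for wi, xi in zip(w, x)]
            else:                      # false positive: demote active attributes
                w = [wi / alpha if xi else wi for wi, xi in zip(w, x)]
    return w, mistakes


if __name__ == "__main__":
    # Example target concept: x1 OR x3 over n = 5 attributes (attributes 2, 4, 5 irrelevant).
    n = 5
    target = lambda x: 1 if (x[0] or x[2]) else 0
    stream = [(x, target(x)) for x in itertools.product([0, 1], repeat=n)]
    w, mistakes = winnow_learn(stream, n)
    print("weights:", w, "mistakes:", mistakes)
```

For a k-literal monotone disjunction, Littlestone's analysis bounds the total number of mistakes by roughly O(k log n), which is the sense in which the mistake-bound model is "helpful": performance degrades only logarithmically with the number of irrelevant attributes.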

[1]  Colin de la Higuera. Characteristic sets for polynomial grammatical inference, 1996, ICGI.

[2]  Leonard Pitt, et al. Reductions among prediction problems: on the difficulty of predicting automata, 1988, Proceedings of the Third Annual Structure in Complexity Theory Conference.

[3]  Rémi Gilleron, et al. PAC Learning with Simple Examples, 1996, STACS.

[4]  David Haussler, et al. Equivalence of models for polynomial learnability, 1988, COLT '88.

[5]  William I. Gasarch, et al. Book Review: An Introduction to Kolmogorov Complexity and Its Applications, Second Edition, 1997, by Ming Li and Paul Vitanyi (Springer, Graduate Texts Series), 1997, SIGACT News.

[6]  Leslie G. Valiant, et al. A theory of the learnable, 1984, STOC '84.

[7]  Ming Li, et al. Learning Simple Concepts Under Simple Distributions, 1991, SIAM J. Comput.

[8]  Rajesh Parekh, et al. Simple DFA are Polynomially Probably Exactly Learnable from Simple Examples, 1999, ICML.

[9]  Ming Li, et al. An Introduction to Kolmogorov Complexity and Its Applications, 2019, Texts in Computer Science.

[10]  Andrew Tomkins, et al. A computational model of teaching, 1992, COLT '92.

[11]  Paul M. B. Vitányi, et al. An Introduction to Kolmogorov Complexity and Its Applications, 1993, Graduate Texts in Computer Science.

[12]  J. Oncina, et al. Inferring Regular Languages in Polynomial Updated Time, 1992.

[13]  Jorge Castro Rabal, et al. Query, PACS and simple-PAC learning, 1998.

[14]  N. Littlestone. Learning Quickly When Irrelevant Attributes Abound: A New Linear-Threshold Algorithm, 1987, 28th Annual Symposium on Foundations of Computer Science (FOCS).

[15]  Sally A. Goldman, et al. Teaching a Smarter Learner, 1996, J. Comput. Syst. Sci.

[16]  Leslie G. Valiant, et al. Cryptographic Limitations on Learning Boolean Formulae and Finite Automata, 1993, Machine Learning: From Theory to Applications.

[17]  Rajesh Parekh, et al. A Polynomial Time Incremental Algorithm for Learning DFA, 1998, ICGI.

[18]  Rajesh Parekh, et al. A Polynomial Time Incremental Algorithm for Regular Grammar Inference, 1997.

[19]  E. Mark Gold, et al. Complexity of Automaton Identification from Given Data, 1978, Inf. Control.

[20]  Dana Angluin, et al. Learning Regular Sets from Queries and Counterexamples, 1987, Inf. Comput.