On the learnability of recursively enumerable languages from good examples

The present paper investigates the identification of indexed families L of recursively enumerable languages from good examples. We distinguish class-preserving learning from good examples (the good examples have to be generated with respect to a hypothesis space having the same range as L) from class-comprising learning from good examples (the good examples have to be selected with respect to a hypothesis space comprising the range of L). A learner is required to learn a target language on every finite superset of the good examples for it. If the learner's first and only conjecture is correct, the underlying learning model is referred to as finite identification from good examples; if the learner makes a finite number of incorrect conjectures before always outputting a correct one, the model is referred to as limit identification from good examples. In the context of class-preserving learning, it is shown that the learning power of finite and limit identification from good text examples coincides. Where class-comprising learning from good text examples is concerned, limit identification is strictly more powerful than finite identification. Furthermore, for learning from good informant examples, limit identification is superior to finite identification in the class-preserving as well as the class-comprising case. Finally, we relate the models of learning from good examples to one another, as well as to the standard learning models of Gold-style language learning.
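To make the two success criteria concrete, the following LaTeX sketch gives one plausible formalization; the notation is our assumption, not quoted from the paper: ex(j) denotes the finite set of good examples of the j-th language h_j in the hypothesis space, M(S) the learner's only conjecture on a finite input set S, and M_n(S) its n-th conjecture on S.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Hedged sketch (assumed notation): ex(j) = finite set of good examples of h_j;
% M(S) = the learner's single conjecture on S; M_n(S) = its n-th conjecture on S.

% Finite identification from good examples: the one and only conjecture must be
% correct on every finite superset S of the good examples that stays inside h_j.
\[
\text{Finite:}\quad
\forall j\ \forall \text{finite } S:\;
\mathrm{ex}(j)\subseteq S\subseteq h_j
\;\Longrightarrow\;
h_{M(S)} = h_j
\]

% Limit identification from good examples: on the same inputs, the learner may
% revise its conjecture finitely often before settling on a correct index k.
\[
\text{Limit:}\quad
\forall j\ \forall \text{finite } S:\;
\mathrm{ex}(j)\subseteq S\subseteq h_j
\;\Longrightarrow\;
\exists k\ \exists n_0\ \forall n\ge n_0:\;
M_n(S)=k \ \wedge\ h_k=h_j
\]
\end{document}

Read this way, the paper's results say that the extra mind changes allowed by the limit criterion buy nothing over the finite criterion on good text examples in the class-preserving case, but do add power in the class-comprising case and, for good informant examples, in both cases.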
