Ordinal Mind Change Complexity of Language Identification

The approach of ordinal mind change complexity, introduced by Freivalds and Smith, uses constructive ordinals to bound the number of mind changes made by a learning machine. It measures the extent to which a learning machine must keep revising its estimate of how many mind changes it will make before converging to a correct hypothesis for languages in the class being learned. This measure, which also reflects the difficulty of learning a class of languages, has recently been used to analyze the learnability of rich classes of languages. Jain and Sharma have shown that the ordinal mind change complexity of identifying, from positive data, the languages formed by unions of up to n pattern languages is ω^n, and that this bound is essential. Similar results were established for classes definable by length-bounded elementary formal systems with up to n clauses; these latter results translate to the learnability of certain classes of logic programs.
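
To make the counting discipline concrete, here is a minimal sketch, not taken from the paper, of a learner that maintains a Freivalds-Smith style ordinal counter. It uses the toy class COINIT = { L_m : m >= 0 } with L_m = {m, m+1, m+2, ...}, a standard example learnable with mind change bound ω; the function name learn_coinit and the data representation are illustrative assumptions.

```python
# Illustrative sketch of an ordinal mind change counter (not the paper's
# construction). COINIT = { L_m : m >= 0 } with L_m = {m, m+1, m+2, ...}
# is learnable with mind change bound omega: the learner starts its
# counter at omega and, on its first conjecture, replaces it with a
# finite ordinal that bounds all remaining mind changes.

OMEGA = "omega"  # symbolic ordinal ω; finite ordinals are plain ints


def learn_coinit(text):
    """Identify L_m in the limit from positive data `text` (an iterable
    of natural numbers), yielding (hypothesis, counter) after each datum.
    The counter must strictly decrease at every mind change."""
    counter = OMEGA       # no conjecture yet: counter stands at ω
    hypothesis = None     # current conjecture for m (least element seen)
    for x in text:
        m = x if hypothesis is None else min(hypothesis, x)
        if m != hypothesis:           # a mind change
            if counter == OMEGA:
                # The true m is at most the first datum, so at most m
                # further mind changes can occur: drop from ω to m.
                counter = m
            else:
                counter -= 1          # later changes lower the finite counter
            hypothesis = m
        yield hypothesis, counter


# The hypothesis can only shrink, so the counter never goes negative:
for h, c in learn_coinit([7, 9, 4, 4, 2]):
    print(f"conjecture m={h}, counter={c}")
```

Intuitively, a learner for unions of up to n such languages must nest this countdown n levels deep, with each component of the union able to trigger its own ω-style reset, which is where bounds of the form ω^n arise.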

References

[1] Stephen Cole Kleene. On notation for ordinal numbers, 1938, Journal of Symbolic Logic.

[2] Dana Angluin. Inductive Inference of Formal Languages from Positive Data, 1980, Inf. Control.

[3] Thomas Zeugmann, et al. Monotonic Versus Nonmonotonic Language Learning, 1991, Nonmonotonic and Inductive Logic.

[4] Thomas Zeugmann, et al. Monotonic and Dual Monotonic Language Learning, 1996, Theor. Comput. Sci.

[5] G. Sacks. Higher Recursion Theory, 1990.

[6] Michael Machtey, Paul Young. An Introduction to the General Theory of Algorithms, 1978.

[7] E. Mark Gold. Language Identification in the Limit, 1967, Inf. Control.

[8] Akihiro Yamamoto, et al. Algorithmic Learning Theory with Elementary Formal Systems, 1992.

[9] Takeshi Shinohara. Rich Classes Inferable from Positive Data: Length-Bounded Elementary Formal Systems, 1994, Inf. Comput.

[10] John Case, Carl H. Smith. Comparison of Identification Criteria for Machine Inductive Inference, 1983, Theor. Comput. Sci.

[11] Daniel N. Osherson, Michael Stob, Scott Weinstein. Systems That Learn: An Introduction to Learning Theory for Cognitive and Computer Scientists, 1990.

[12] Sanjay Jain, Arun Sharma. On the intrinsic complexity of language identification, 1994, COLT '94.

[13] Rusins Freivalds, Carl H. Smith. On the Role of Procrastination in Machine Learning, 1993, Inf. Comput.

[14] Dana Angluin. Finding Patterns Common to a Set of Strings, 1980, J. Comput. Syst. Sci.

[15] Hartley Rogers Jr. Theory of Recursive Functions and Effective Computability, 1969.

[16] Carl H. Smith, et al. On the Complexity of Inductive Inference, 1986, Inf. Control.

[17] Leonard Pitt. Inductive Inference, DFAs, and Computational Complexity, 1989, AII.

[18] Keith Wright. Identification of unions of languages drawn from an identifiable class, 1989, COLT '89.

[19] Heikki Mannila, et al. MDL learning of unions of simple pattern languages from positive examples, 1995, EuroCOLT.

[20] Robert E. Schapire. Pattern languages are not learnable, 1990, COLT '90.

[21] Takeshi Shinohara, et al. The correct definition of finite elasticity: corrigendum to identification of unions, 1991, COLT '91.

[22] John Case, et al. Not-So-Nearly-Minimal-Size Program Inference, 1995, GOSLER Final Report.

[23] Yasuhito Mukouchi, et al. Inductive Inference of an Approximate Concept from Positive Data, 1994, AII/ALT.

[24] Takeshi Shinohara. Studies on Inductive Inference from Positive Data, 1986.

[25] Shyam Kapur, et al. Monotonic Language Learning, 1992, ALT.

[26] Thomas Zeugmann, et al. A Guided Tour Across the Boundaries of Learning Recursive Languages, 1995, GOSLER Final Report.

[27] Andris Ambainis. The power of procrastination in inductive inference: How it depends on used ordinal notations, 1995, EuroCOLT.