Learning without Coding

Iterative learning is a model of language learning from positive data, due to Wiehagen. When compared to a learner in Gold's original model of language learning from positive data, an iterative learner can be thought of as memory-limited. However, an iterative learner can memorize some input elements by coding them into the syntax of its hypotheses. A main concern of this paper is: to what extent are such coding tricks necessary? One means of preventing some such coding tricks is to require that the hypothesis space used be free of redundancy, i.e., that it be 1-1. By extending a result of Lange & Zeugmann, we show that many interesting and non-trivial classes of languages can be iteratively identified in this manner. On the other hand, we show that there exists a class of languages that cannot be iteratively identified using any 1-1 effective numbering as the hypothesis space. We also consider an iterative-like learning model in which the computational component of the learner is modeled as an enumeration operator, as opposed to a partial computable function. In this new model, there are no hypotheses, and, thus, no syntax in which the learner can encode what elements it has or has not yet seen. We show that there exists a class of languages that can be identified under this new model, but that cannot be iteratively identified. On the other hand, we show that there exists a class of languages that cannot be identified under this new model, but that can be iteratively identified using a Friedberg numbering as the hypothesis space.
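
For readers unfamiliar with the model, the following is a minimal sketch of the standard formalization of iterative learning from the literature, stated in common learning-theory notation (a text T enumerating the target language L, an initial conjecture h_init, and W_h for the language with index h in the hypothesis space); the paper's own definitions may differ in details. An iterative learner M is a partial computable function that maps its previous conjecture and the next datum to a new conjecture:

\[
  h_0 = M(h_{\mathrm{init}}, T(0)), \qquad h_{n+1} = M(h_n, T(n+1)),
\]

and M iteratively identifies L just in case, for every text T for L, the sequence (h_n) converges to a single hypothesis h with W_h = L. The coding tricks at issue arise because nothing in this definition prevents M from choosing h_{n+1} so that its syntax encodes the datum T(n+1), effectively smuggling memory into the hypothesis.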

[1] Rolf Wiehagen et al. Polynomial-time inference of arbitrary pattern languages, 2009, New Generation Computing.

[2] Ayumi Shinohara et al. Knowledge Acquisition from Amino Acid Sequences by Machine Learning System BONSAI, 1992.

[3] John Case et al. Infinitary self-reference in learning theory, 1994, J. Exp. Theor. Artif. Intell.

[4] Hartley Rogers Jr. Theory of Recursive Functions and Effective Computability, 1969.

[5] Thomas Zeugmann et al. Incremental Learning from Positive Data, 1996, J. Comput. Syst. Sci.

[6] Rusins Freivalds et al. Inductive Inference and Computable One-One Numberings, 1982, Math. Log. Q.

[7] John Case et al. Strongly Non-U-Shaped Learning Results by General Techniques, 2010, COLT.

[8] John Case et al. Results on memory-limited U-shaped learning, 2007, Inf. Comput.

[9] Manuel Blum et al. Toward a Mathematical Theory of Inductive Inference, 1975, Inf. Control.

[10] John Case et al. Optimal Language Learning, 2008, ALT.

[11] John Case et al. Periodicity in generations of automata, 1974, Mathematical Systems Theory.

[12] Rolf Wiehagen. Limes-Erkennung rekursiver Funktionen durch spezielle Strategien [Limit identification of recursive functions by special strategies], 1975, J. Inf. Process. Cybern.

[13] Leonor Becerra-Bonache et al. Iterative learning of simple external contextual languages, 2010, Theor. Comput. Sci.

[14] Rolf Wiehagen. A Thesis in Inductive Inference, 1990, Nonmonotonic and Inductive Logic.

[15] Richard M. Friedberg. Three theorems on recursive enumeration. I. Decomposition. II. Maximal set. III. Enumeration without duplication, 1958, Journal of Symbolic Logic.

[16] Martin Kummer. An Easy Priority-Free Proof of a Theorem of Friedberg, 1990, Theor. Comput. Sci.

[17] Akihiro Yamamoto et al. Topological properties of concept spaces (full version), 2010, Inf. Comput.

[18] Dana Angluin. Finding Patterns Common to a Set of Strings, 1980, J. Comput. Syst. Sci.

[19] Sanjay Jain et al. Incremental learning with temporary memory, 2010, Theor. Comput. Sci.

[20] E. Mark Gold. Language Identification in the Limit, 1967, Inf. Control.

[21] Thomas Zeugmann et al. Learning indexed families of recursive languages from positive data: A survey, 2008, Theor. Comput. Sci.

[22] Sanjay Jain et al. Learning in Friedberg numberings, 2008, Inf. Comput.

[23] Mark A. Fulk. Prudence and Other Conditions on Formal Language Learning, 1990, Inf. Comput.