Learning Deterministic Even Linear Languages from Positive Examples

Abstract We consider the problem of learning deterministic even linear languages from positive examples. We show that, for any nonnegative integer k, the class of LR(k) even linear languages is not learnable from positive examples, while there is a subclass called LRS(k), a natural subclass of LR(k) in the strong sense, that is learnable from positive examples. Our learning algorithm identifies this subclass in the limit, updating conjectures in almost linear time. As a corollary, in terms of even linear grammars, we obtain a learning algorithm for k-reversible languages that is more efficient than the one proposed by Angluin.
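To fix intuition for the grammar class the abstract refers to, the sketch below gives a minimal Python representation of an even linear grammar, where every production has the form A -> uBv with |u| = |v| or A -> w for a terminal string w, together with a membership check that peels equal-length prefixes and suffixes. The example grammar and all names here are hypothetical illustrations; this is background only and is not the paper's learning algorithm.

```python
from functools import lru_cache

# Hypothetical even linear grammar generating { a^n b^n : n >= 0 }.
# Each nonterminal maps to productions of two kinds:
#   ("step", u, B, v) with |u| == |v|   (rule A -> u B v)
#   ("final", w)                        (rule A -> w, w a terminal string)
GRAMMAR = {
    "S": [("step", "a", "S", "b"), ("final", "")],
}

def derives(nonterminal: str, w: str) -> bool:
    """Check whether `nonterminal` derives the terminal string `w`."""

    @lru_cache(maxsize=None)
    def go(nt: str, s: str) -> bool:
        for rule in GRAMMAR[nt]:
            if rule[0] == "final":
                if s == rule[1]:
                    return True
            else:
                _, u, b, v = rule
                # Peel off the equal-length prefix u and suffix v, then recurse
                # on the remaining middle portion of the string.
                if (len(s) >= len(u) + len(v)
                        and s.startswith(u) and s.endswith(v)):
                    if go(b, s[len(u):len(s) - len(v)]):
                        return True
        return False

    return go(nonterminal, w)

if __name__ == "__main__":
    for s in ["", "ab", "aabb", "aab", "ba"]:
        print(repr(s), derives("S", s))
```

Because every step consumes one symbol from each end of the string, an even linear grammar can also be viewed as a regular grammar over pairs of symbols, which is the connection that lets results on regular languages (such as Angluin's k-reversible inference) be transferred to this setting.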