An EM approach to grammatical inference: input/output HMMs

Proposes a modular recurrent connectionist architecture for adaptive temporal processing. The model is given a probabilistic interpretation and is trained using the expectation-maximisation (EM) algorithm; it can also be viewed as an input/output hidden Markov model (IOHMM). The focus of the paper is on sequence classification tasks, and the authors show that supervised EM learning is well suited to grammatical inference problems. Benchmark results on the seven Tomita grammars demonstrate that these adaptive models can attain excellent generalisation.
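
To make the idea concrete, here is a minimal sketch of the forward pass of an input/output HMM used as a binary-string classifier (accept/reject, as in a Tomita-style task): the hidden-state transitions are conditioned on the current input symbol, and a class distribution is read out from the final state belief. All names, shapes, and the random parameters below are illustrative assumptions, not the paper's exact formulation; in the paper the parameters would be fitted with EM (forward-backward style re-estimation) rather than left random.

```python
import numpy as np

rng = np.random.default_rng(0)

n_states = 4      # hidden states
n_inputs = 2      # input alphabet {0, 1}
n_classes = 2     # reject / accept

# Input-conditioned transition matrices:
# A[u][i, j] = P(s_t = j | s_{t-1} = i, u_t = u)
A = rng.random((n_inputs, n_states, n_states))
A /= A.sum(axis=2, keepdims=True)

# Class distribution per hidden state, read out at the end of the sequence
B = rng.random((n_states, n_classes))
B /= B.sum(axis=1, keepdims=True)

pi = np.full(n_states, 1.0 / n_states)   # initial state distribution


def classify(inputs):
    """Forward recursion: propagate the state belief through the
    input-conditioned transitions, then read out P(class | sequence)."""
    alpha = pi.copy()
    for u in inputs:
        alpha = alpha @ A[u]   # state belief after consuming symbol u
    return alpha @ B           # mix per-state class distributions


print(classify([1, 0, 1, 1]))  # e.g. [P(reject), P(accept)]
```

EM training would alternate this kind of forward pass with a backward pass to compute expected state and transition occupancies given each training string and its label, then re-estimate A and B from those expected counts.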