Imitation learning of non-linear point-to-point robot motions using Dirichlet processes

In this paper we discuss the use of the infinite Gaussian mixture model and Dirichlet processes for learning robot movements from demonstrations. The starting point of this work is an earlier paper in which the authors learn a non-linear dynamic robot movement model from a small number of observations. The model in that work is learned using a classical finite Gaussian mixture model (FGMM) whose mixture components are appropriately constrained. The drawback of this approach is that one must guess in advance how many mixture components the FGMM should use. In this work, we generalize the approach to an infinite Gaussian mixture model (IGMM), which does not have this limitation: the IGMM automatically infers the number of mixture components needed to reflect the complexity of the data. For use in the context of a non-linear dynamic model, we develop a Constrained IGMM (CIGMM). We validate our algorithm on the same data that was used in [5], where the authors use motion capture devices to record the demonstrations. As further validation, we test our approach on novel data acquired on our iCub robot in a different demonstration scenario, in which the robot is physically guided by the human demonstrator.
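The key idea above, letting a Dirichlet-process prior infer the number of mixture components rather than fixing it in advance, can be sketched with off-the-shelf tools. The following is a minimal illustration (not the authors' CIGMM, which additionally constrains the components for the dynamic model): it uses scikit-learn's variational Dirichlet-process Gaussian mixture on synthetic 2-D data standing in for demonstration observations. The cluster locations, truncation level, and weight threshold are assumptions chosen for the example.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Synthetic 2-D observations drawn from three well-separated Gaussians,
# a stand-in for points collected from motion demonstrations.
rng = np.random.default_rng(0)
data = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.1, size=(100, 2)),
    rng.normal(loc=[2.0, 2.0], scale=0.1, size=(100, 2)),
    rng.normal(loc=[-2.0, 2.0], scale=0.1, size=(100, 2)),
])

# Dirichlet-process mixture: set the truncation level (n_components) generously
# and let the stick-breaking prior assign negligible weight to unneeded components,
# instead of guessing the component count as a finite GMM would require.
dpgmm = BayesianGaussianMixture(
    n_components=10,  # truncation level, not a guess of the true count
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(data)

# Components retaining non-negligible weight are the "effective" mixtures
# the data actually support; the threshold here is an illustrative choice.
effective = int(np.sum(dpgmm.weights_ > 0.01))
print(effective)
```

In practice the effective count settles near the number of clusters genuinely present in the data, which is the behavior the IGMM exploits in place of the FGMM's fixed component count.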