A 3D Dynamic Model of Human Actions for Probabilistic Image Tracking

In this paper we present a method suitable for use as a temporal prior for human tracking within a particle filtering framework such as CONDENSATION [5]. The method predicts feasible human postures from a reduced set of previous postures, thereby drastically reducing the number of particles needed to track a generic, highly articulated object. Given a sequence of preceding postures, this example-driven transition model probabilistically matches the most likely postures from a database of human actions. Each action in the database is represented within a PCA-like space, called UaSpace, which is well suited to performing the probabilistic match when searching for similar sequences. In this way, different but feasible postures from the database become the newly predicted poses.
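To make the idea concrete, the following Python sketch illustrates one possible reading of such an example-driven transition model. It is not the paper's implementation: the array shapes, the simple SVD-based projection standing in for UaSpace, the Gaussian weighting of subsequence distances, and the function name `predict_next_postures` are all assumptions introduced for illustration.

```python
import numpy as np

# Hypothetical sketch of an example-driven transition model:
# posture vectors are projected into a low-dimensional PCA-like space
# (standing in for the paper's UaSpace), the recent posture history is
# matched against stored action sequences, and likely next postures are
# sampled from the postures that follow the best-matching subsequences.

rng = np.random.default_rng(0)

# Assumed database: a list of actions, each a (T_i x D) array of postures.
D = 30                                    # raw posture dimensionality (assumption)
actions = [rng.normal(size=(120, D)) for _ in range(5)]

# PCA-like projection learned from all database postures.
all_postures = np.vstack(actions)
mean = all_postures.mean(axis=0)
_, _, Vt = np.linalg.svd(all_postures - mean, full_matrices=False)
K = 8                                     # retained components (assumption)
project = lambda X: (X - mean) @ Vt[:K].T

def predict_next_postures(history, n_samples=100, sigma=1.0):
    """Sample feasible next postures given the last few raw postures.

    history: (L x D) array of the most recent postures.
    Returns n_samples raw posture vectors drawn from database matches.
    """
    L = len(history)
    h = project(history).ravel()
    candidates, weights = [], []
    for act in actions:
        z = project(act)
        # Slide a window of length L over the action and score its similarity
        # to the observed history in the projected space.
        for t in range(len(z) - L):
            d = np.linalg.norm(z[t:t + L].ravel() - h)
            weights.append(np.exp(-0.5 * (d / sigma) ** 2))
            candidates.append(act[t + L])  # posture following the matched window
    weights = np.asarray(weights)
    weights /= weights.sum()
    idx = rng.choice(len(candidates), size=n_samples, p=weights)
    return np.asarray(candidates)[idx]

# Usage: predict plausible next poses (e.g. for propagating particles)
# from a 5-frame posture history.
history = actions[0][10:15]
predicted = predict_next_postures(history)
print(predicted.shape)                    # (100, 30)
```

In a CONDENSATION-style tracker, such sampled postures would serve as the proposal for propagating particles, concentrating them on feasible poses rather than spreading them over the full articulated pose space.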