A Comparison Framework for Walking Performances using aSpaces

In this paper, we address the analysis of human actions by comparing different performances of the same action executed by different actors. Specifically, we present a comparison procedure applied to the walking action, although the scheme can be applied to other actions, such as bending or running. To achieve fair comparison results, we define a novel human body model based on joint angles, which maximizes the differences between human postures and, moreover, reflects the anatomical structure of human beings. Subsequently, a human action space, called aSpace, is built to represent each performance (i.e., each predefined sequence of postures) as a parametric manifold. The final human action representation is called p-action, and it is based on the most characteristic human body postures found across several walking performances. These postures, called key-frames, are found automatically by means of a predefined distance function. By using key-frames, we synchronize any performance with respect to the p-action. Furthermore, by adopting an arc-length parameterization, the representation becomes independent of the speed at which performances are played. As a result, the style of human walking can be successfully analysed by establishing differences in joint behaviour between female and male walkers.
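The overall pipeline can be sketched compactly. The following Python/NumPy example is a minimal illustration of the general idea, not the authors' implementation: it builds a PCA subspace (standing in for an aSpace) from stacked joint-angle posture vectors, projects a performance into that space, picks key-frames as local maxima of the distance from the mean projected posture (one plausible choice; the paper only states that a predefined distance function is used), and re-parameterizes the projected curve by arc length so that playback speed no longer matters. All function and variable names are illustrative.

```python
import numpy as np

def build_aspace(postures, n_components=3):
    """PCA over joint-angle posture vectors (rows = frames).

    Returns the mean posture and the top principal directions,
    which together span the reduced action space ("aSpace")."""
    mean = postures.mean(axis=0)
    centered = postures - mean
    # SVD of the centered data; rows of vt are principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def project(postures, mean, basis):
    """Project joint-angle postures onto the aSpace basis."""
    return (postures - mean) @ basis.T

def key_frames(projected):
    """Pick key-frames as interior local maxima of the distance to the
    mean projected posture (an illustrative stand-in for the paper's
    predefined distance function)."""
    d = np.linalg.norm(projected - projected.mean(axis=0), axis=1)
    idx = [i for i in range(1, len(d) - 1) if d[i] > d[i - 1] and d[i] >= d[i + 1]]
    return np.array(idx, dtype=int)

def arc_length_resample(curve, n_samples=100):
    """Re-parameterize a manifold curve by arc length so that equal
    parameter steps cover equal distances, making the representation
    independent of performance speed."""
    deltas = np.linalg.norm(np.diff(curve, axis=0), axis=1)
    s = np.concatenate(([0.0], np.cumsum(deltas)))
    s /= s[-1]                      # normalized arc length in [0, 1]
    targets = np.linspace(0.0, 1.0, n_samples)
    resampled = np.empty((n_samples, curve.shape[1]))
    for j in range(curve.shape[1]):
        resampled[:, j] = np.interp(targets, s, curve[:, j])
    return resampled

# Example: two synthetic "performances" of the same gait cycle at different speeds.
def synth(t):
    # Toy joint angles: a few coupled sinusoids standing in for a gait cycle.
    return np.stack([np.sin(t), np.cos(t), 0.5 * np.sin(2 * t)], axis=1)

t_fast = np.linspace(0, 2 * np.pi, 60)       # fast performance, fewer frames
t_slow = np.linspace(0, 2 * np.pi, 180)      # slow performance, more frames

mean, basis = build_aspace(np.vstack([synth(t_fast), synth(t_slow)]), n_components=2)
fast_curve = arc_length_resample(project(synth(t_fast), mean, basis))
slow_curve = arc_length_resample(project(synth(t_slow), mean, basis))

# After arc-length resampling the two curves are directly comparable
# sample by sample, regardless of the original playback speed.
print("max deviation between performances:", np.abs(fast_curve - slow_curve).max())
print("key-frame indices (fast):", key_frames(project(synth(t_fast), mean, basis)))
```

In this toy setup the two resampled curves nearly coincide, which is the point of the arc-length step: once performances are mapped into the action space and synchronized, differences that remain reflect posture and style rather than speed.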
