Emulating human perception of motion similarity

Evaluating the similarity of motions is useful for motion retrieval, motion blending, and performance analysis of dancers and athletes. The Euclidean distance between corresponding joints has been widely adopted for measuring the similarity of postures, and hence of motions. However, such a measure does not necessarily conform to the human perception of motion similarity. In this paper, we propose a new similarity measure based on machine learning techniques. We make use of the results of questionnaires in which subjects answered whether arbitrary pairs of motions appear similar or not. Using the relative distances between the joints as the basic features, we train the system to compute the similarity of arbitrary pairs of motions. Experimental results show that our method outperforms methods based on the Euclidean distance between corresponding joints. Our method is applicable to content-based retrieval of human motion for large-scale database systems. It is also applicable to e-Learning systems that automatically evaluate the performance of dancers and athletes by comparing the subjects' motions with those of experts. Copyright © 2008 John Wiley & Sons, Ltd.
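The pipeline the abstract describes can be sketched roughly as follows: extract relative inter-joint distances as pose features, take the per-feature differences between two poses, and fit a model to the human "similar / not similar" labels so that the learned weights, rather than a flat Euclidean distance, determine similarity. The feature choice, classifier, and all names below are illustrative assumptions, not the authors' actual implementation.

```python
import math
from itertools import combinations

def pose_features(joints):
    """Relative distances between all joint pairs (an assumed feature set).

    joints: list of (x, y, z) joint positions for one pose.
    """
    return [math.dist(a, b) for a, b in combinations(joints, 2)]

def pair_features(pose_a, pose_b):
    """Per-feature absolute differences between two poses."""
    fa, fb = pose_features(pose_a), pose_features(pose_b)
    return [abs(x - y) for x, y in zip(fa, fb)]

def train_logistic(examples, labels, lr=0.1, epochs=200):
    """Fit logistic regression to human judgments (1 = 'looks similar').

    A stand-in learner; the paper's actual model may differ.
    """
    n = len(examples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = y - p  # gradient of the log-likelihood
            w = [wi + lr * g * xi for wi, xi in zip(w, x)]
            b += lr * g
    return w, b

def similarity(pose_a, pose_b, w, b):
    """Learned similarity score in (0, 1) for a pair of poses."""
    x = pair_features(pose_a, pose_b)
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

For whole motions, the same idea would be applied per frame (or per time-warped frame pair) and aggregated; the sketch above only shows the per-pose case.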
