The possibility of storing large quantities of human motion capture (mocap) data has created a growing demand for content-based retrieval of motion sequences that does not rely on annotations or other metadata. We address the problem of rapidly retrieving perceptually similar occurrences of a particular motion in a long mocap sequence or an unstructured mocap database, for the purpose of replicating editing operations with minimal user input. One or more editing operations applied to a given motion are automatically propagated to all similar matching motions. This general approach applies to standard mocap editing operations such as time-warping, filtering, and motion warping. The resulting style of interaction lies between full automation and complete user control. Unlike recent mocap synthesis systems [1], which generate new motion by searching for plausible transitions between motion segments, our method efficiently searches for similar motions using a query-by-example paradigm, while still allowing extensive parameterization of the matching criteria.
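As a rough illustration of the query-by-example idea (a minimal sketch, not the paper's actual algorithm), the code below brute-forces dynamic time warping (DTW) matching of an example clip against a long sequence and replicates a single editing operation on every match. All function names (`dtw_distance`, `find_matches`, `apply_to_matches`, `smooth`) are hypothetical, the moving-average filter stands in for a real editing operation, and a practical system would use indexing structures like those in [1, 2, 3, 7] rather than a linear scan.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two clips of shape
    (frames, features), cf. the exact-indexing work of Keogh [2]."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def find_matches(query, sequence, threshold, step=5):
    """Slide a query-length window over the long sequence and keep the
    (start, end) spans whose DTW distance to the query falls below the
    threshold. Brute force; a spatial index over low-dimensional
    features [1, 3, 7] would replace this linear scan in practice."""
    w = len(query)
    matches = []
    for start in range(0, len(sequence) - w + 1, step):
        if dtw_distance(query, sequence[start:start + w]) < threshold:
            matches.append((start, start + w))
    return matches

def apply_to_matches(sequence, matches, edit):
    """Replicate one editing operation (a length-preserving function
    clip -> clip) on every matched span of the sequence."""
    out = sequence.copy()
    for start, end in matches:
        out[start:end] = edit(out[start:end])
    return out

def smooth(clip, k=5):
    """Per-feature moving-average filter, standing in here for a real
    mocap editing operation such as filtering or motion warping."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, clip)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mocap = rng.standard_normal((500, 12))  # toy sequence: 500 frames, 12 DOFs
    query = mocap[100:140]                  # example motion to match
    spans = find_matches(query, mocap, threshold=100.0)  # threshold is data-dependent
    print("matched spans:", spans)
    edited = apply_to_matches(mocap, spans, smooth)
```

The DTW cost table makes the matching robust to modest timing variations between the query and its occurrences, which is why the paper's cited retrieval literature [2, 6] favors elastic measures over frame-by-frame Euclidean comparison.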
[1] Antonin Guttman et al. "R-trees: A dynamic index structure for spatial searching." SIGMOD '84, 1984.
[2] Eamonn J. Keogh et al. "Exact indexing of dynamic time warping." Knowledge and Information Systems, 2002.
[3] Christos Faloutsos et al. "Efficient similarity search in sequence databases." FODO, 1993.
[4] Okan Arikan et al. "Interactive motion generation from examples." ACM Trans. Graph., 2002.
[5] Norman I. Badler et al. "The EMOTE model for effort and shape." SIGGRAPH, 2000.
[6] Dimitrios Gunopulos et al. "Discovering similar multidimensional trajectories." Proceedings 18th International Conference on Data Engineering, 2002.
[7] Christian Böhm et al. "Searching in high-dimensional spaces: Index structures for improving the performance of multimedia databases." CSUR, 2001.
[8] Setsuo Ohsuga et al. International Conference on Very Large Data Bases (VLDB), 1977.