Data-driven cloth animation
We present a new method for cloth animation based on data-driven synthesis. In contrast to approaches that rely on physical simulation, we animate cloth by recombining short sequences of existing cloth animation. While our source of data is cloth animation captured with video cameras ([White et al. 2007]), the method is equally applicable to simulation data. The approach has benefits in both cases: current cloth capture is limited because even small tweaks to the data require filming an entirely new sequence, while simulation suffers from long computation times and complications such as tangling. In this sketch we create new animations by fitting cloth animation to human motion capture data, i.e., we drive the cloth with a skeleton.
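The core idea of driving captured cloth clips from a skeleton can be illustrated with a simplified selection step: given a pose feature for the current skeleton frame, pick the library clip whose starting pose is most similar. The feature dimensionality, the toy clip library, and the function name `select_clip` below are all illustrative assumptions, not the authors' actual method; a minimal nearest-neighbor sketch in NumPy:

```python
import numpy as np

def select_clip(pose, clip_start_poses):
    """Return the index of the clip whose starting pose feature is
    closest (in L2 distance) to the query skeleton pose."""
    dists = np.linalg.norm(clip_start_poses - pose, axis=1)
    return int(np.argmin(dists))

# Toy library: 3 cloth clips, each keyed by a 4-D pose feature of its
# first frame (hypothetical values for illustration only).
clip_starts = np.array([
    [0.0, 0.0, 0.0, 0.0],
    [1.0, 0.0, 0.5, 0.0],
    [0.2, 1.0, 0.0, 0.3],
])

query = np.array([0.9, 0.1, 0.4, 0.1])
print(select_clip(query, clip_starts))  # -> 1 (second clip is nearest)
```

A full system would also blend the chosen clip's mesh frames into the ongoing animation to avoid visible seams; this sketch only shows the clip-matching step.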
[1] David A. Forsyth et al. Capturing and animating occluded cloth. ACM Trans. Graph., 2007.
[2] Jovan Popović et al. Deformation transfer for triangle meshes. SIGGRAPH 2004.