Data-driven human motion synthesis based on angular momentum analysis

In this paper, we present a novel method for real-time synthesis of human motion under external perturbations. The proposed method is data-driven and based on angular momentum analysis. When an external force is applied to the virtual human body, we analyze the change in the joints' angular momenta over a short time window, predict the body's response, retrieve an appropriate motion sequence from a pre-built motion capture (MoCap) database, and smoothly blend the current and retrieved motion sequences to obtain the synthesized motion. Our main contributions are a complete angular momentum analysis solution for the human body and an effective organization of the MoCap data based on the major characteristics of the body motion and the external force. As a result, realistic human motion is synthesized in real time, as demonstrated experimentally with walking, running, and jumping sequences.
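The sketch below illustrates, in simplified form, the pipeline the abstract outlines: computing the change in angular momentum over a short window, retrieving the best-matching clip from a database, and blending into it. It is not the paper's implementation; the point-mass joint model, the dictionary-based database entries (with assumed keys `delta_L` and `frames`), and the linear cross-fade are illustrative assumptions only.

```python
import numpy as np

def angular_momentum(positions, velocities, masses):
    """Angular momentum of point-mass joints about the body's center of mass.

    positions, velocities: (J, 3) arrays; masses: (J,) array.
    """
    com = np.average(positions, axis=0, weights=masses)
    com_vel = np.average(velocities, axis=0, weights=masses)
    r = positions - com                      # joint positions relative to COM
    v = velocities - com_vel                 # joint velocities relative to COM
    return np.sum(masses[:, None] * np.cross(r, v), axis=0)

def momentum_change(frames, masses, dt):
    """Change in total angular momentum over a short window of frames (T, J, 3)."""
    vels = np.gradient(frames, dt, axis=0)   # finite-difference joint velocities
    L = np.array([angular_momentum(p, v, masses) for p, v in zip(frames, vels)])
    return L[-1] - L[0]

def retrieve_and_blend(query_dL, database, current_clip, blend_frames=10):
    """Pick the clip whose stored momentum change best matches the query,
    then linearly cross-fade from the current clip into it (illustrative only)."""
    best = min(database, key=lambda e: np.linalg.norm(e["delta_L"] - query_dL))
    clip = best["frames"]                    # (T, J, 3) retrieved motion
    w = np.linspace(0.0, 1.0, blend_frames)[:, None, None]
    blend = (1 - w) * current_clip[-blend_frames:] + w * clip[:blend_frames]
    return np.concatenate([blend, clip[blend_frames:]], axis=0)
```

In this simplified setting, the retrieval key is a single 3-vector of momentum change and the transition is a per-joint linear cross-fade; the paper's actual analysis and data organization are richer, using the major characteristics of the body motion and the external force.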