Optimization-based interactive motion synthesis for virtual characters

Modeling the reactions of human characters to a dynamic environment is crucial for achieving perceptual immersion in applications such as video games, training simulations, and films. Virtual characters in these applications must react realistically to environmental events and precisely follow high-level user commands. Most existing physics engines for computer animation facilitate the synthesis of passive motion but remain unsuccessful at generating motion that requires active control, as character animation does. We present an optimization-based approach to synthesizing active motion for articulated characters that emphasizes both physical realism and user controllability. At each time step, we optimize the motion against a set of goals specified by higher-level decision makers, subject to the Lagrangian dynamics and the physical limitations of the character. Our framework represents each decision maker as a controller.
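To make the per-timestep optimization concrete, the following is a minimal sketch on a one-link pendulum "character": at each step, a torque is chosen to minimize deviation from a PD-style goal acceleration, subject to the equation of motion and an actuator limit. The pendulum model, the gains `kp`/`kd`, and all parameter values are illustrative assumptions for this sketch, not the paper's actual formulation.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical one-link pendulum "character"; every parameter below is an
# illustrative assumption, not taken from the paper.
m, l, g, dt = 1.0, 1.0, 9.81, 0.01   # mass, length, gravity, time step
tau_max = 10.0                       # actuator (torque) limit
kp, kd = 50.0, 15.0                  # gains defining the goal acceleration
I = m * l**2                         # moment of inertia about the pivot

def optimize_step(q, qd, q_goal):
    """Per-timestep optimization: pick the torque that best tracks a
    goal acceleration, subject to the dynamics and the torque limit."""
    grav = m * g * l * np.sin(q)               # gravity torque
    qdd_goal = kp * (q_goal - q) - kd * qd     # goal from a decision maker

    def cost(tau):
        qdd = (tau - grav) / I                 # equation of motion
        return (qdd - qdd_goal) ** 2           # deviation from the goal

    res = minimize_scalar(cost, bounds=(-tau_max, tau_max), method="bounded")
    qdd = (res.x - grav) / I
    qd = qd + dt * qdd                         # semi-implicit Euler step
    q = q + dt * qd
    return q, qd

q, qd = 0.0, 0.0
for _ in range(1000):                          # 10 s of simulated time
    q, qd = optimize_step(q, qd, q_goal=0.5)
```

Here a single goal term drives the objective; in the framework described above, several controllers (decision makers) would each contribute goal terms to one optimization over the whole articulated character.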