An automatic tool to facilitate authoring animation blending in game engines

Achieving realistic virtual humans is crucial in virtual reality applications and video games. Nowadays there are software and game development tools that greatly help to generate and simulate characters. They offer easy-to-use GUIs to create characters by dragging and dropping features and making small modifications. Similarly, there are tools to create animation graphs and to set blending parameters, among other tasks. Unfortunately, even though these tools are relatively user-friendly, achieving natural animation transitions is not straightforward, and thus non-expert users tend to spend a large amount of time generating animations that are not completely free of artefacts. In this paper we present a method to automatically generate animation blend spaces in Unreal Engine, which offers two advantages: first, it provides a tool to evaluate the quality of an animation set; second, the resulting graph does not depend on user skill and is thus not prone to user error.
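A blend space interpolates a set of animation clips placed in a parameter space (for example, locomotion speed and turning angle). As a minimal illustration of the interpolation step only, and not of the authors' generation method or of Unreal Engine's actual API, the following self-contained C++ sketch resolves blend weights for a query point using barycentric coordinates over one triangle of samples; all clip names and parameter values are hypothetical.

```cpp
// Minimal sketch: resolving blend weights for a 2D blend space query point
// via barycentric coordinates over one triangle of animation samples.
// All names (AnimSample, the speed/turn axes) are illustrative only.
#include <array>
#include <cstdio>
#include <string>

struct AnimSample {
    std::string clip;   // animation clip name
    float x;            // blend parameter 1, e.g. speed (m/s)
    float y;            // blend parameter 2, e.g. turning angle (deg)
};

// Barycentric coordinates of point (px, py) with respect to triangle (a, b, c).
std::array<float, 3> Barycentric(const AnimSample& a, const AnimSample& b,
                                 const AnimSample& c, float px, float py) {
    const float det = (b.y - c.y) * (a.x - c.x) + (c.x - b.x) * (a.y - c.y);
    const float w0 = ((b.y - c.y) * (px - c.x) + (c.x - b.x) * (py - c.y)) / det;
    const float w1 = ((c.y - a.y) * (px - c.x) + (a.x - c.x) * (py - c.y)) / det;
    return {w0, w1, 1.0f - w0 - w1};
}

int main() {
    // Three locomotion clips placed in the (speed, turn) parameter space.
    AnimSample idle{"Idle", 0.0f, 0.0f};
    AnimSample walk{"Walk", 1.5f, 0.0f};
    AnimSample turn{"WalkTurn", 1.5f, 45.0f};

    // Query: character moving at 1.0 m/s while turning 15 degrees.
    const auto w = Barycentric(idle, walk, turn, 1.0f, 15.0f);

    std::printf("%s: %.2f  %s: %.2f  %s: %.2f\n",
                idle.clip.c_str(), w[0],
                walk.clip.c_str(), w[1],
                turn.clip.c_str(), w[2]);
    return 0;
}
```

In a full blend space the sample points would be triangulated (for instance with a Delaunay triangulation), the query point would first be located in its containing triangle, and the resulting weights would drive the per-clip contribution when blending poses at runtime.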
