Variational integrator graph networks for learning energy-conserving dynamical systems.

Recent advances show that neural networks embedded with physics-informed priors significantly outperform vanilla neural networks in learning and predicting the long-term dynamics of complex physical systems from noisy data. Despite this success, there has been only limited study of how to optimally combine physics priors to improve predictive performance. To tackle this problem, we unpack and generalize recent innovations into individual inductive bias segments. As such, we are able to systematically investigate all possible combinations of inductive biases, of which existing methods are a natural subset. Using this framework we introduce variational integrator graph networks, a novel method that unifies the strengths of existing approaches by combining an energy constraint, high-order symplectic variational integrators, and graph neural networks. We demonstrate, through an extensive ablation, that the proposed unifying framework outperforms existing methods in both data efficiency and predictive accuracy, across both single- and many-body problems studied in the recent literature. We empirically show that the improvements arise because high-order variational integrators combined with a potential energy constraint induce coupled learning of generalized position and momentum updates, which can be formalized via the partitioned Runge-Kutta method.
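To illustrate the symplectic integration idea the abstract refers to (this is a generic sketch, not the paper's implementation), the following shows the Störmer-Verlet scheme, a second-order symplectic partitioned Runge-Kutta method, applied to a harmonic oscillator. In the learned setting, the hand-coded potential gradient below would be replaced by the gradient of a neural network's predicted potential energy; the names `potential_grad` and `verlet_step` are illustrative, not from the paper.

```python
def potential_grad(q):
    # Harmonic oscillator V(q) = q^2 / 2, so dV/dq = q.
    # In a learned model, this gradient would come from a neural potential.
    return q

def verlet_step(q, p, h):
    # Stormer-Verlet: a symplectic partitioned Runge-Kutta scheme whose
    # position and momentum updates are interleaved (coupled), mirroring
    # the coupled learning of generalized position and momentum updates.
    p_half = p - 0.5 * h * potential_grad(q)          # half momentum kick
    q_new = q + h * p_half                            # position drift (unit mass)
    p_new = p_half - 0.5 * h * potential_grad(q_new)  # half momentum kick
    return q_new, p_new

def energy(q, p):
    # Total energy H(q, p) = p^2/2 + V(q) for unit mass.
    return 0.5 * p * p + 0.5 * q * q

# Integrate for a long horizon and measure energy drift.
q, p, h = 1.0, 0.0, 0.01
e0 = energy(q, p)
for _ in range(10_000):
    q, p = verlet_step(q, p, h)
drift = abs(energy(q, p) - e0)
```

Unlike explicit Euler, whose energy error grows with the horizon, the symplectic scheme keeps `drift` bounded over long rollouts, which is the property that makes such integrators attractive as an inductive bias for energy-conserving dynamics.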