Generalized Physics-Informed Learning through Language-Wide Differentiable Programming

Scientific computing is increasingly incorporating advances in machine learning to enable data-driven, physics-informed modeling. However, re-targeting existing scientific computing workloads to machine learning frameworks is both costly and limiting, as scientific simulations tend to use the full feature set of a general-purpose programming language. In this manuscript we develop an infrastructure for incorporating deep learning into existing scientific computing code through Differentiable Programming (∂P). We describe a ∂P system that can take gradients of full Julia programs, making automatic differentiation a first-class language feature and compatibility with deep learning pervasive. Our system exploits the one-language nature of Julia package development to augment the existing package ecosystem with deep learning, supporting almost all language constructs (control flow, recursion, mutation, etc.) while generating high-performance code without requiring any user intervention or refactoring to stage computations. We showcase several examples of physics-informed learning that directly leverage this extension to existing simulation code: neural surrogate models, machine learning on simulated quantum hardware, and data-driven discovery of stochastic dynamics with neural stochastic differential equations.
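As a minimal sketch of the interface this describes, assuming the Zygote.jl realization of the ∂P system (the function mypow below is a hypothetical example introduced here for illustration, not code from the paper), an ordinary Julia function written with recursion and control flow can be differentiated directly, with no tracing or refactoring:

```julia
using Zygote  # source-to-source reverse-mode AD for Julia

# An ordinary Julia function using control flow and recursion;
# no special array types or staged computation are required.
function mypow(x, n)
    n == 0 && return one(x)
    return x * mypow(x, n - 1)
end

# `gradient` differentiates the unmodified function and returns a
# tuple with one entry per argument of the closure.
g, = gradient(x -> mypow(x, 3), 2.0)
g  # == 12.0, i.e. d/dx x^3 evaluated at x = 2
```

Because differentiation operates on Julia's SSA-form intermediate representation rather than on a framework-specific graph, the same mechanism composes with existing ecosystem code, such as differentiating through a DifferentialEquations.jl solve, which is the pattern the physics-informed examples above rely on.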
