Learning in Sinusoidal Spaces with Physics-Informed Neural Networks

A physics-informed neural network (PINN) uses a physics-augmented loss function, e.g., one incorporating the residual of the governing differential equations, to ensure that its output is consistent with fundamental physical laws. In practice, however, accurate PINN models turn out to be difficult to train for many problems. In this paper, we address this issue through a novel perspective on the merits of learning in sinusoidal spaces with PINNs. By analyzing asymptotic behavior at model initialization, we first prove that a PINN of increasing size (i.e., width and depth) is biased towards flat outputs. Notably, a flat function is a trivial solution to many physics differential equations; it deceptively minimizes the residual term of the augmented loss while remaining far from the true solution. We then show that a sinusoidal mapping of the inputs, in an architecture we label sfPINN, elevates output variability and thereby avoids entrapment in this deceptive local minimum. Moreover, the level of variability can be effectively modulated to match high-frequency patterns in the problem at hand. A key facet of this paper is a comprehensive empirical study demonstrating the efficacy of learning in sinusoidal spaces with PINNs for a wide range of forward and inverse modelling problems spanning multiple physics domains.
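To make the idea concrete, the following is a minimal sketch (not the authors' reference implementation) of a PINN whose inputs are first lifted into a sinusoidal space before the usual fully-connected layers, trained with a physics-residual loss. It is illustrated on a simple 1D Poisson problem u''(x) = -pi^2 sin(pi x) on [0, 1] with u(0) = u(1) = 0; the problem, the layer sizes, and the frequency scale `omega` are illustrative assumptions rather than settings taken from the paper.

```python
# Sketch: sinusoidal-feature PINN for u''(x) = -pi^2 sin(pi x), u(0) = u(1) = 0.
# Class name, sizes, and `omega` are hypothetical choices for illustration only.
import math
import torch
import torch.nn as nn

class SinusoidalFeaturePINN(nn.Module):
    def __init__(self, in_dim=1, features=64, hidden=64, omega=5.0):
        super().__init__()
        # Trainable linear map followed by sin(.) lifts inputs into a sinusoidal space;
        # scaling the initial weights by `omega` modulates the feature frequencies and
        # hence the variability of the network output.
        self.mapping = nn.Linear(in_dim, features)
        with torch.no_grad():
            self.mapping.weight.mul_(omega)
        self.body = nn.Sequential(
            nn.Linear(features, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.body(torch.sin(self.mapping(x)))

def physics_residual(model, x):
    # PDE residual u''(x) + pi^2 sin(pi x), computed with automatic differentiation.
    x = x.requires_grad_(True)
    u = model(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    return d2u + (math.pi ** 2) * torch.sin(math.pi * x)

model = SinusoidalFeaturePINN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_col = torch.rand(256, 1)                # interior collocation points in (0, 1)
x_bc = torch.tensor([[0.0], [1.0]])       # boundary points

for step in range(5000):
    opt.zero_grad()
    loss_pde = physics_residual(model, x_col).pow(2).mean()
    loss_bc = model(x_bc).pow(2).mean()    # enforce u(0) = u(1) = 0
    loss = loss_pde + loss_bc
    loss.backward()
    opt.step()
```

With `omega` set to 0, the first layer output is nearly zero and the network starts close to a flat function, which already makes the residual term small; larger `omega` values raise the output variability at initialization, which is the mechanism the abstract attributes to the sinusoidal mapping.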
