Neural Networks with Inputs Based on Domain of Dependence and A Converging Sequence for Solving Conservation Laws, Part I: 1D Riemann Problems

Recent work on solving partial differential equations (PDEs) with deep neural networks (DNNs) has shown that spatiotemporal function approximators trained with automatic differentiation are effective for nonlinear problems such as Burgers' equation, heat conduction equations, the Allen-Cahn and other reaction-diffusion equations, and the Navier-Stokes equations. Researchers have also applied automatic differentiation within physics-informed neural networks (PINNs) to nonlinear hyperbolic systems of conservation laws with strong discontinuities, such as Riemann problems, but typically through data-driven inverse-problem formulations. It remains a challenge for forward DNN methods, which do not assume partial knowledge of the solution, to resolve discontinuities in nonlinear conservation laws. In this study, we incorporate first-order numerical schemes into the DNN loss function in place of the automatic differentiation provided by standard deep learning frameworks such as TensorFlow, which improves the ability to capture discontinuities in Riemann problems. In particular, we introduce the 2-Coarse-Grid neural network (2CGNN) and the 2-Diffusion-Coefficient neural network (2DCNN). Two solutions of a conservation law from a converging sequence, computed with a low-cost numerical scheme and restricted to a domain of dependence of a space-time grid point, serve as the input to a neural network that predicts the high-fidelity solution at that grid point. Although the input solutions are smeared, the trained networks output sharp approximations to solutions containing shocks and contacts, and they are inexpensive to evaluate once trained.
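To make the input-output structure concrete, the following is a minimal sketch (not the authors' code) of a network that maps two low-cost coarse solutions, sampled on a stencil covering the domain of dependence of a space-time grid point, to a high-fidelity value at that point. The stencil width, layer sizes, activations, and training data below are illustrative assumptions.

```python
# Hedged sketch of the pointwise coarse-to-fine mapping described in the abstract.
# Assumptions (not from the paper): STENCIL = 5 points per coarse solution,
# two hidden layers of width 32 with tanh activation, MSE loss, placeholder data.
import numpy as np
import tensorflow as tf

STENCIL = 5                   # assumed number of stencil points in the domain of dependence
N_INPUT = 2 * STENCIL         # two coarse solutions from the converging sequence, per stencil

model = tf.keras.Sequential([
    tf.keras.Input(shape=(N_INPUT,)),
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(1),  # predicted high-fidelity solution at the grid point
])
model.compile(optimizer="adam", loss="mse")

# Illustrative training data: each row stacks the two coarse solutions restricted
# to the stencil; the target is the reference high-fidelity value at the point.
x_train = np.random.rand(1000, N_INPUT).astype("float32")  # placeholder inputs
y_train = np.random.rand(1000, 1).astype("float32")        # placeholder targets
model.fit(x_train, y_train, epochs=10, batch_size=64, verbose=0)

# Once trained, the network is applied pointwise: gather the stencil values of the
# two cheap solutions around any space-time grid point and predict the solution there.
y_pred = model.predict(x_train[:5], verbose=0)
```

In practice the placeholder arrays would be replaced by stencil data extracted from the two low-cost numerical solutions and by high-fidelity reference values at the corresponding grid points.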
