Coarse-grained and emergent distributed parameter systems from data

We explore the derivation of evolution laws for distributed parameter systems (in particular, partial differential operators and the associated partial differential equations, PDEs) from spatiotemporal data. This is, of course, a classical identification problem; our focus here is on the use of manifold learning techniques (in particular, variations of Diffusion Maps) in conjunction with neural network learning algorithms, which allow us to attempt this task when the dependent variables, and even the independent variables, of the PDE are not known a priori and must themselves be derived from the data. The similarity measure used in Diffusion Maps for dependent coarse-variable detection involves distances between local particle distribution observations; for independent-variable detection we use distances between local short-time dynamics. We demonstrate each approach through an illustrative, established PDE example. Such variable-free, emergent-space identification algorithms connect naturally with equation-free multiscale computation tools.
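The Diffusion Maps embedding underlying both variable-detection steps can be sketched as follows. This is a minimal sketch of the standard density-normalized Diffusion Maps construction only; the function name, the kernel scale `epsilon`, and the use of plain Euclidean distances are illustrative assumptions. In the setting described above, the Euclidean distances would be replaced by the problem-specific similarity measures (distances between local particle distribution observations, or between local short-time dynamics).

```python
import numpy as np

def diffusion_maps(X, epsilon, n_coords=2):
    """Density-normalized Diffusion Maps embedding of rows of X."""
    # Pairwise squared Euclidean distances between observations
    D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian kernel with scale epsilon
    K = np.exp(-D2 / epsilon)
    # Density normalization (alpha = 1) to decouple geometry from sampling density
    q = K.sum(axis=1)
    K_tilde = K / np.outer(q, q)
    # Degrees of the normalized kernel
    d = K_tilde.sum(axis=1)
    # Symmetric conjugate of the Markov matrix P = D^{-1} K_tilde,
    # used for a numerically stable eigendecomposition
    S = K_tilde / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(S)
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    # Recover right eigenvectors of P from those of S
    psi = vecs / np.sqrt(d)[:, None]
    # Drop the trivial constant eigenvector; scale coordinates by eigenvalues
    return psi[:, 1:n_coords + 1] * vals[1:n_coords + 1]
```

The leading nontrivial eigenvector coordinates then serve as candidate (dependent or independent) variables parametrizing the data manifold.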
