Fast Parameter Inference on Pulsar Timing Arrays with Normalizing Flows

Pulsar timing arrays (PTAs) currently rely on expensive MCMC methods for Bayesian posterior inference. Given a dataset of ~10-100 pulsars with O(10^3) timing residuals each, producing a posterior distribution for the stochastic gravitational wave background (SGWB) can take days to weeks. The computational bottleneck is the likelihood evaluation required at every MCMC step, which is extremely costly given the dimensionality of the search space. Fortunately, generating simulated data is fast, so modern simulation-based inference techniques can be brought to bear on the problem. In this paper, we demonstrate that conditional normalizing flows trained on simulated data enable extremely fast and accurate estimation of SGWB posteriors, reducing the sampling time from days to seconds.
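The idea described above can be sketched on a toy problem: draw parameters from the prior, run a fast simulator, fit a conditional normalizing flow q(theta | x) by maximum likelihood, then condition on an "observed" dataset to draw posterior samples instantly. The single-layer affine flow, the 1-D Gaussian simulator, and all variable names below are illustrative assumptions for exposition, not the paper's actual PTA pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- fast simulator (toy assumption): theta ~ N(0,1) prior,
#     observed data x = theta + N(0, 0.5^2) noise
N = 5000
theta = rng.standard_normal(N)
x = theta + 0.5 * rng.standard_normal(N)

# Conditional flow with one affine layer: z = (theta - mu(x)) * exp(-s),
# with mu(x) = a*x + b.  By the change-of-variables formula,
#   log q(theta | x) = log N(z; 0, 1) - s,
# and we maximize this over (a, b, s) by plain gradient descent.
a, b, s = 0.0, 0.0, 0.0
lr = 0.05
for _ in range(3000):
    z = (theta - a * x - b) * np.exp(-s)
    # hand-derived gradients of the mean negative log-likelihood
    ga = np.mean(-z * x) * np.exp(-s)
    gb = np.mean(-z) * np.exp(-s)
    gs = 1.0 - np.mean(z ** 2)
    a -= lr * ga
    b -= lr * gb
    s -= lr * gs

# Amortized inference: condition on an observed x and sample the posterior
# by pushing base-distribution samples through the inverse transform.
x_obs = 1.0
post_samples = a * x_obs + b + np.exp(s) * rng.standard_normal(10000)
# Analytically, the posterior here is N(0.8 * x_obs, 0.2), so the sample
# mean should land near 0.8 and the standard deviation near sqrt(0.2).
print(post_samples.mean(), post_samples.std())
```

A real PTA application would replace the affine layer with an expressive architecture (e.g. masked autoregressive or spline flows conditioned on summary statistics of the timing residuals), but the training objective and the instant amortized sampling step are exactly as above.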
