Fast Parameter Inference on Pulsar Timing Arrays with Normalizing Flows