Bootstrapping Neural Processes

Unlike traditional statistical modeling, in which a user typically hand-specifies a prior, Neural Processes (NPs) implicitly define a broad class of stochastic processes with neural networks. Given a data stream, an NP learns a stochastic process that best describes the data. While this "data-driven" way of learning stochastic processes has proven effective on various types of data, NPs still rely on the assumption that the uncertainty in the stochastic process is captured by a single latent variable, which potentially limits flexibility. To this end, we propose the Bootstrapping Neural Process (BNP), a novel extension of the NP family using the bootstrap. The bootstrap is a classical data-driven technique for estimating uncertainty, which allows BNP to learn the stochasticity in NPs without assuming a particular parametric form. We demonstrate the efficacy of BNP on various types of data and its robustness in the presence of model-data mismatch.
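To make the core idea concrete, below is a minimal sketch of how bootstrap resampling of the context set can replace a single global latent variable as the source of uncertainty. It assumes a generic conditional predictor `predict_fn(context_x, context_y, target_x)` (e.g., a trained CNP-style model); the function name and signature are hypothetical and illustrate only the classical bootstrap step, not the full BNP architecture described in the paper.

```python
import numpy as np

def bootstrap_np_predict(predict_fn, context_x, context_y, target_x,
                         num_bootstrap=8, rng=None):
    """Estimate functional uncertainty by bootstrap-resampling the context set.

    predict_fn(context_x, context_y, target_x) is a hypothetical conditional
    predictor returning a predictive mean of shape [num_targets]. This is a
    sketch of the bootstrap idea only, not the paper's exact method.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = context_x.shape[0]
    means = []
    for _ in range(num_bootstrap):
        # Resample context points with replacement (the classical bootstrap).
        idx = rng.integers(0, n, size=n)
        means.append(predict_fn(context_x[idx], context_y[idx], target_x))
    means = np.stack(means)  # [num_bootstrap, num_targets]
    # Spread across bootstrap replicates serves as a data-driven
    # uncertainty estimate in place of a single latent variable.
    return means.mean(axis=0), means.std(axis=0)
```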
