Sequential Neural Processes

Neural Processes (NPs) combine the strengths of neural networks and Gaussian processes to achieve both flexible learning and fast prediction in stochastic processes. However, a large class of problems involves an underlying temporal dependency structure across a sequence of stochastic processes that NPs do not explicitly model. In this paper, we propose Sequential Neural Processes (SNPs), which incorporate a temporal state-transition model over stochastic processes and thereby extend NPs to dynamic stochastic processes. In applying SNPs to dynamic 3D scene modeling, we introduce Temporal Generative Query Networks. To our knowledge, this is the first 4D model that can handle the temporal dynamics of 3D scenes. In experiments, we evaluate the proposed methods on dynamic (non-stationary) regression and on 4D scene inference and rendering.
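The core idea above, rolling an NP-style context representation forward through a latent state transition, can be illustrated with a minimal NumPy sketch. This is not the paper's architecture: it assumes a mean-pooled context encoder and uses a deterministic transition in place of the stochastic one for brevity, and all weight matrices and function names (`encode_context`, `transition`, `decode`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_context(xs, ys):
    # NP-style permutation-invariant context summary: concatenate each
    # (x, y) pair and mean-pool over the context set.
    return np.concatenate([xs, ys], axis=-1).mean(axis=0)

def transition(z_prev, r_t, W_z, W_r):
    # SNP-style temporal transition: the latent at time t depends on the
    # previous latent state AND the current context summary.
    # (Deterministic stand-in for the stochastic transition; illustrative.)
    return np.tanh(W_z @ z_prev + W_r @ r_t)

def decode(x, z, W_d):
    # Predict a target value for query input x from the current latent z.
    return W_d @ np.concatenate([x, z])

# Illustrative dimensions and randomly initialized weights (not trained).
d_x, d_y, d_z = 1, 1, 4
d_r = d_x + d_y
W_z = 0.1 * rng.normal(size=(d_z, d_z))
W_r = 0.1 * rng.normal(size=(d_z, d_r))
W_d = 0.1 * rng.normal(size=(d_y, d_x + d_z))

z = np.zeros(d_z)
for t in range(3):
    xs = rng.normal(size=(5, d_x))   # context inputs observed at time t
    ys = np.sin(xs)                  # context outputs at time t
    r = encode_context(xs, ys)
    z = transition(z, r, W_z, W_r)   # roll the latent state forward
    y_query = decode(np.array([0.5]), z, W_d)
```

The key contrast with a plain NP is the `transition` step: instead of inferring an independent latent from each time step's context alone, the latent state carries information across time steps.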
