A Functional Dynamic Boltzmann Machine

Dynamic Boltzmann machines (DyBMs) are recently developed generative models of time series. They are designed to learn a time series through efficient online learning algorithms while capturing long-term dependencies with the help of eligibility traces, recursively updatable memory units that store descriptive statistics of all past data. Current DyBMs assume a finite-dimensional time series and cannot be applied to a functional time series, in which the dimension goes to infinity (e.g., spatiotemporal data on a continuous space). In this paper, we present the functional dynamic Boltzmann machine (F-DyBM) as a generative model of a functional time series. A technical challenge is to devise an online learning algorithm with which the F-DyBM, consisting of functions and integrals, can learn a functional time series from only finite observations of it. We rise to this challenge by combining a kernel-based function approximation method with a statistical interpolation method, and we derive closed-form update rules. We design numerical experiments to empirically confirm the effectiveness of our solutions. The experimental results demonstrate consistent error reductions compared to baseline methods, from which we conclude that the F-DyBM is effective for functional time series prediction.
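The core difficulty named above is recovering a whole function from finitely many observations of it. The abstract does not specify the kernel or approximation scheme used; as an illustrative stand-in, the sketch below uses kernel ridge regression with a Gaussian (RBF) kernel, which is one standard kernel-based function approximation method. All names and parameter values here are assumptions for illustration only.

```python
import numpy as np

def rbf_kernel(x, z, gamma=10.0):
    # Gaussian (RBF) kernel matrix between location vectors x and z.
    return np.exp(-gamma * (x[:, None] - z[None, :]) ** 2)

def fit_function(x_obs, y_obs, reg=1e-6, gamma=10.0):
    # Kernel ridge regression: from finite observations (x_obs, y_obs)
    # of an unknown function, build an approximation that can be
    # evaluated at any location in the continuous domain.
    K = rbf_kernel(x_obs, x_obs, gamma)
    alpha = np.linalg.solve(K + reg * np.eye(len(x_obs)), y_obs)
    return lambda x_new: rbf_kernel(np.atleast_1d(x_new), x_obs, gamma) @ alpha

# Finite observations of the (here known) function f(x) = sin(2*pi*x).
x_obs = np.linspace(0.0, 1.0, 20)
y_obs = np.sin(2 * np.pi * x_obs)
f_hat = fit_function(x_obs, y_obs)

# The fitted approximation can be queried at unobserved locations.
err = abs(f_hat(0.37)[0] - np.sin(2 * np.pi * 0.37))
```

In a functional time series setting, a step like this would be applied to each observed snapshot, turning finitely sampled data into function-valued inputs for the model.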
