Block Hankel Tensor ARIMA for Multiple Short Time Series Forecasting

This work proposes a novel approach for forecasting multiple time series. First, the multi-way delay embedding transform (MDT) is employed to represent the time series as low-rank block Hankel tensors (BHT). The resulting higher-order tensors are then projected onto compressed core tensors via Tucker decomposition. At the same time, a generalized tensor Autoregressive Integrated Moving Average (ARIMA) model is applied directly to the consecutive core tensors to predict future samples. In this manner, the proposed approach combines the advantages of MDT tensorization (which exploits mutual correlations) and tensor ARIMA coupled with low-rank Tucker decomposition in a unified framework. This framework exploits the low-rank structure of the block Hankel tensors in the embedded space and captures the intrinsic correlations among multiple time series, thereby improving forecasting results, especially for multiple short time series. Experiments on three public datasets and two industrial datasets verify that the proposed BHT-ARIMA improves forecasting accuracy and reduces computational cost compared with state-of-the-art methods.

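The pipeline described above (delay embedding, Tucker compression of the non-temporal modes, an autoregressive model on the resulting core tensors, and reconstruction) can be sketched in a few lines of NumPy. The sketch below is illustrative only: the function names (mdt, forecast_next), the HOSVD-style factor estimation, the parameter defaults, and the plain least-squares AR(p) used in place of the paper's generalized tensor ARIMA are assumptions made for clarity, not the authors' implementation.

    import numpy as np

    def mdt(X, tau):
        """Multi-way delay embedding: N series of length T -> an (N, tau, K)
        block Hankel tensor with K = T - tau + 1 lagged windows."""
        T = X.shape[1]
        return np.stack([X[:, k:k + tau] for k in range(T - tau + 1)], axis=2)

    def hosvd_factors(H, ranks):
        """Leading left singular vectors of the mode-0 and mode-1 unfoldings
        (HOSVD-style Tucker factors for the two non-temporal modes)."""
        factors = []
        for mode, r in zip((0, 1), ranks):
            unfolding = np.moveaxis(H, mode, 0).reshape(H.shape[mode], -1)
            u, _, _ = np.linalg.svd(unfolding, full_matrices=False)
            factors.append(u[:, :r])
        return factors

    def forecast_next(X, tau=4, ranks=(3, 2), p=2):
        """One-step-ahead forecast for each of the N series in X (shape N x T)."""
        H = mdt(X, tau)                                    # (N, tau, K)
        U0, U1 = hosvd_factors(H, ranks)
        # Project onto compressed core tensors; the time mode (K) is kept intact.
        G = np.einsum('ni,mj,nmk->ijk', U0, U1, H)         # (r0, r1, K)
        r0, r1, K = G.shape
        Z = G.reshape(r0 * r1, K)                          # vectorised core series
        # Plain least-squares AR(p) on the core sequence (a simple stand-in for
        # the generalized tensor ARIMA of the paper).
        targets = Z[:, p:].T                               # (K - p, r0 * r1)
        lags = np.hstack([Z[:, p - i:K - i].T for i in range(1, p + 1)])
        coef, *_ = np.linalg.lstsq(lags, targets, rcond=None)
        z_next = np.concatenate([Z[:, K - i] for i in range(1, p + 1)]) @ coef
        # Lift the predicted core back to the embedded space; the last column of
        # the predicted Hankel slice is the next value of each series.
        H_next = U0 @ z_next.reshape(r0, r1) @ U1.T        # (N, tau)
        return H_next[:, -1]

    # Toy usage: five short, correlated series of length 30.
    rng = np.random.default_rng(0)
    t = np.arange(30)
    X = np.sin(0.3 * t)[None, :] + 0.1 * rng.standard_normal((5, 30))
    print(forecast_next(X))                                # one forecast per series

In this sketch the time mode of the block Hankel tensor is deliberately left uncompressed, so the sequence of core tensors preserves the original temporal ordering, and the one-step forecast is read off as the last column of the reconstructed Hankel slice.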