TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis

Time series analysis is of immense importance in a wide range of applications, such as weather forecasting, anomaly detection, and action recognition. This paper focuses on temporal variation modeling, the key problem shared across these analysis tasks. Previous methods attempt to model temporal variations directly from the 1D time series, which is extremely challenging due to the intricate temporal patterns. Based on the observation of multi-periodicity in time series, we disentangle the complex temporal variations into multiple intraperiod and interperiod variations. To overcome the limited representation capability of 1D time series, we extend the analysis of temporal variations into 2D space by transforming the 1D time series into a set of 2D tensors based on multiple periods. This transformation embeds the intraperiod and interperiod variations into the columns and rows of the 2D tensors respectively, making the 2D variations easy to model with 2D kernels. Technically, we propose TimesNet, with TimesBlock as a task-general backbone for time series analysis. TimesBlock adaptively discovers the multi-periodicity and extracts the complex temporal variations from the transformed 2D tensors with a parameter-efficient inception block. TimesNet achieves consistent state-of-the-art performance on five mainstream time series analysis tasks: short- and long-term forecasting, imputation, classification, and anomaly detection. Code is available at this repository: https://github.com/thuml/TimesNet.
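To make the core transformation concrete, below is a minimal sketch of the period discovery and 1D-to-2D folding described above, written in PyTorch. The function names (`find_periods`, `reshape_to_2d`) and the exact tensor layout are illustrative assumptions following the abstract's description (intraperiod variation along columns, interperiod variation along rows); they are not the official TimesNet implementation, which is available in the linked repository.

```python
import torch

def find_periods(x, k=2):
    """Return the k dominant period lengths of a batch of series, found by
    picking the largest peaks of the FFT amplitude spectrum.

    x: float tensor of shape [batch, length, channels].
    """
    # Amplitude spectrum averaged over batch and channels: shape [length // 2 + 1]
    spectrum = torch.fft.rfft(x, dim=1).abs().mean(dim=(0, 2))
    spectrum[0] = 0.0                          # ignore the zero-frequency (mean) component
    top_freqs = torch.topk(spectrum, k).indices
    periods = x.shape[1] // top_freqs          # frequency index -> period length
    return periods, spectrum[top_freqs]

def reshape_to_2d(x, period):
    """Fold a 1D series into a 2D tensor whose columns each hold one full period,
    so intraperiod variation runs down a column and interperiod variation runs
    along a row, ready for 2D convolution kernels.

    x: float tensor of shape [batch, length, channels].
    Returns a tensor of shape [batch, channels, period, num_periods].
    """
    batch, length, channels = x.shape
    if length % period != 0:                   # zero-pad so the length divides evenly
        pad = period - length % period
        x = torch.cat([x, x.new_zeros(batch, pad, channels)], dim=1)
    num_periods = x.shape[1] // period
    # [batch, num_periods, period, channels] -> [batch, channels, period, num_periods]
    return x.reshape(batch, num_periods, period, channels).permute(0, 3, 2, 1)

# Example: a 96-step series with a strong period of 24
t = torch.arange(96, dtype=torch.float32)
x = torch.sin(2 * torch.pi * t / 24).reshape(1, 96, 1)
periods, _ = find_periods(x, k=1)
x2d = reshape_to_2d(x, int(periods[0]))        # shape [1, 1, 24, 4]
```

In the paper, each TimesBlock applies this kind of fold for every discovered period, processes the resulting 2D tensors with the parameter-efficient inception block, and aggregates the outputs back into the 1D representation weighted by the corresponding spectrum amplitudes.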
