Time Series Data Prediction Based on Sequence to Sequence Model

Home appliance testing is an essential part of the R&D process. However, because tests must be carried out under specific conditions, testing lengthens the development cycle. If home appliance test data could be predicted, the efficiency of home appliance development would improve. The sequence-to-sequence model has been shown to map input sequences to output sequences effectively, and it has been widely used in machine translation. To address the home appliance test data prediction problem, we made a first attempt at applying a sequence-to-sequence model to the prediction of numerically continuous time series data. Our experiments show that the model can predict relatively long sequences of future steps from relatively short input sequences. In these experiments, the mean absolute percentage error (MAPE) of the model's predictions is about 0.1%. Our approach generalizes well and performs well across different experimental scenarios. We conclude that the sequence-to-sequence method performs satisfactorily on prediction problems for continuous time series data, yielding more accurate results than traditional methods, which is meaningful for home appliance R&D.
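
The abstract describes, at a high level, an encoder-decoder (sequence-to-sequence) network that reads a short input sequence and emits a longer output sequence. As a rough illustration only, the following PyTorch sketch shows that idea for continuous time series; the GRU cells, the hidden size, and the names Seq2SeqForecaster and mape are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class Seq2SeqForecaster(nn.Module):
    """Minimal GRU encoder-decoder for continuous time series (illustrative)."""

    def __init__(self, n_features: int = 1, hidden_size: int = 64):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden_size, batch_first=True)
        self.decoder = nn.GRU(n_features, hidden_size, batch_first=True)
        self.proj = nn.Linear(hidden_size, n_features)

    def forward(self, src: torch.Tensor, horizon: int) -> torch.Tensor:
        # src: (batch, src_len, n_features); horizon: number of future steps.
        _, state = self.encoder(src)       # compress the input into a hidden state
        step = src[:, -1:, :]              # seed decoding with the last observation
        outputs = []
        for _ in range(horizon):
            out, state = self.decoder(step, state)
            step = self.proj(out)          # one-step prediction, fed back as input
            outputs.append(step)
        return torch.cat(outputs, dim=1)   # (batch, horizon, n_features)


def mape(y_true: torch.Tensor, y_pred: torch.Tensor) -> torch.Tensor:
    """Mean absolute percentage error, the metric quoted in the abstract.

    Assumes y_true contains no zeros.
    """
    return (y_true - y_pred).abs().div(y_true.abs()).mean() * 100
```

Feeding the decoder its own previous prediction, as above, matches how such a model is used at inference time; during training one could instead feed the ground-truth value at each step (teacher forcing), a standard choice for sequence-to-sequence models.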
