Application research of several LSTM variants in power quality time series data prediction

Predicting power quality from monitoring data is one of the main topics in power quality research for power grids. With the development of smart grids, power quality monitoring data, a key indicator for analyzing and regulating stable power transmission, is growing explosively. In recent years, deep learning methods have increasingly outperformed traditional methods in fitting large-scale data. Building on the strict time-series dependence of power quality data, this paper applies the Long Short-Term Memory (LSTM) neural network and studies the prediction performance of several LSTM variants (Stacked LSTM, Bi-LSTM, Encoder-Decoder LSTM) on power quality time series. Each variant is trained and used to model the data, and their performance is compared and analyzed on power quality data collected by a State Grid company. The verification results show that the variant structures achieve higher prediction accuracy than the standard LSTM network.
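For concreteness, the sketch below shows one way the three compared variants can be built, assuming a Keras/TensorFlow implementation; the window length, forecast horizon, layer widths and optimizer settings are illustrative placeholders, since the exact configuration used on the State Grid data is not given here.

```python
# Minimal sketch of the three LSTM variants compared in the paper, using
# Keras (TensorFlow). Window length, horizon and layer sizes are assumed
# placeholders, not the authors' actual configuration.
from tensorflow.keras import layers, models

N_STEPS_IN = 24    # length of the input window (assumed)
N_STEPS_OUT = 6    # forecast horizon for the encoder-decoder model (assumed)
N_FEATURES = 1     # univariate power quality series (assumed)

def stacked_lstm():
    # Two LSTM layers stacked; the first returns the full sequence so the
    # second layer can consume it.
    return models.Sequential([
        layers.LSTM(64, return_sequences=True, input_shape=(N_STEPS_IN, N_FEATURES)),
        layers.LSTM(32),
        layers.Dense(1),
    ])

def bi_lstm():
    # Bidirectional wrapper processes the input window forwards and backwards.
    return models.Sequential([
        layers.Bidirectional(layers.LSTM(64), input_shape=(N_STEPS_IN, N_FEATURES)),
        layers.Dense(1),
    ])

def encoder_decoder_lstm():
    # Encoder compresses the input window into a state vector; the decoder
    # unrolls it over the forecast horizon.
    return models.Sequential([
        layers.LSTM(64, input_shape=(N_STEPS_IN, N_FEATURES)),   # encoder
        layers.RepeatVector(N_STEPS_OUT),
        layers.LSTM(64, return_sequences=True),                  # decoder
        layers.TimeDistributed(layers.Dense(1)),
    ])

if __name__ == "__main__":
    for build in (stacked_lstm, bi_lstm, encoder_decoder_lstm):
        model = build()
        model.compile(optimizer="adam", loss="mse")
        model.summary()
```

In practice, each model is fitted on sliding windows of the monitoring series and evaluated on a held-out segment so the variants can be compared under identical data splits.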
