LSTM networks for the trend prediction of gases dissolved in power transformer insulation oil

Accurately forecasting the content of gases dissolved in power transformer oil provides an important basis for transformer condition assessment. In this paper, a new forecasting model based on long short-term memory (LSTM) networks is proposed. Sequences of dissolved-gas measurements are used to train the LSTM network, which learns the correlations between historical condition-monitoring data and the gas content at the forecasting time, so that the development trend of the gas content is extracted automatically. Case studies show that the proposed method effectively forecasts the content of gases dissolved in transformer oil and achieves higher forecasting accuracy than the grey model, the back-propagation network model, and the support vector machine model. It also overcomes the low stability of traditional methods and their limitation of considering only one characteristic gas.
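The paper's own network architecture and hyperparameters are not reproduced in this abstract, so the following is only a minimal sketch, assuming PyTorch, of the general idea it describes: an LSTM trained on sliding windows of historical multi-gas concentrations to forecast the next measurement. The gas list, window length, layer sizes, and training settings below are illustrative assumptions, not the authors' values.

```python
# Minimal sketch (assumed PyTorch), NOT the authors' implementation:
# an LSTM maps a window of past dissolved-gas concentrations
# (e.g. H2, CH4, C2H6, C2H4, C2H2, CO, CO2) to the next time step.

import torch
import torch.nn as nn


class GasLSTMForecaster(nn.Module):
    """LSTM regressor: a window of past gas concentrations -> next-step values."""

    def __init__(self, n_gases: int = 7, hidden_size: int = 32, num_layers: int = 1):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_gases, hidden_size=hidden_size,
                            num_layers=num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, n_gases)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window_length, n_gases)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])  # last hidden state -> forecast for t+1


def make_windows(series: torch.Tensor, window: int = 12):
    """Slice a (T, n_gases) monitoring series into (input window, next value) pairs."""
    xs, ys = [], []
    for t in range(len(series) - window):
        xs.append(series[t:t + window])
        ys.append(series[t + window])
    return torch.stack(xs), torch.stack(ys)


if __name__ == "__main__":
    # Synthetic stand-in for a dissolved-gas monitoring record (200 samples, 7 gases).
    torch.manual_seed(0)
    trend = torch.linspace(0, 1, 200).unsqueeze(1) * torch.rand(1, 7)
    series = trend + 0.01 * torch.randn(200, 7)

    x, y = make_windows(series, window=12)
    model = GasLSTMForecaster(n_gases=7)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.MSELoss()

    for epoch in range(200):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()

    # One-step-ahead forecast from the most recent window.
    with torch.no_grad():
        next_step = model(series[-12:].unsqueeze(0))
    print("forecast for next sample:", next_step.squeeze(0).tolist())
```

In this sketch all gases are forecast jointly from one multivariate input window, which is one plausible way to avoid the single-characteristic-gas limitation the abstract mentions; in practice the monitoring data would first be normalised and split into training and test periods before fitting.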
