Load forecasting using deep neural networks

Short-term electricity demand prediction is of great importance to power companies: it is needed to ensure adequate generating capacity and, in some cases, to estimate the supply of raw material (e.g., natural gas) required to produce that capacity. The deregulation of the power industry in many countries has only magnified this need. Prior research in this area has applied shallow neural networks and other machine learning algorithms to the problem. However, recent results in other areas, such as computer vision and speech recognition, have shown great promise for deep neural networks (DNNs), yet far less research exists on their application to short-term load forecasting. In this paper, we apply DNNs as well as other machine learning techniques to short-term load forecasting in a power grid, using data taken from periodic smart-meter energy-usage reports. Our results indicate that DNNs perform well compared to traditional approaches. We also show how these results could be used if dynamic pricing were introduced to reduce peak loading.

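To make the setup concrete, the sketch below shows one common way to frame short-term load forecasting as supervised learning: lagged smart-meter readings are used to predict the next hour's demand, and a small multi-layer network is compared against a linear-regression baseline. This is only an illustrative sketch, not the paper's actual pipeline; the synthetic data, lag window, network size, and error metric are all assumptions made for the example.

```python
# Illustrative sketch (not the paper's exact method): next-hour load forecasting
# from lagged readings with a small multi-layer perceptron vs. a linear baseline.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic hourly load with a daily cycle plus noise (stand-in for smart-meter data).
hours = np.arange(24 * 365)
load = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

# Build supervised pairs: the previous 24 hourly readings predict the next hour's load.
LAGS = 24
X = np.stack([load[i:i + LAGS] for i in range(load.size - LAGS)])
y = load[LAGS:]

# Chronological train/test split (no shuffling for time series).
split = int(0.8 * len(X))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

def mape(y_true, y_pred):
    """Mean absolute percentage error, a common load-forecasting metric."""
    return 100 * np.mean(np.abs((y_true - y_pred) / y_true))

# Traditional baseline vs. a small multi-hidden-layer network (sizes are assumptions).
models = {
    "linear": LinearRegression(),
    "dnn": MLPRegressor(hidden_layer_sizes=(64, 64, 32), max_iter=1000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: MAPE = {mape(y_te, model.predict(X_te)):.2f}%")
```

The chronological split matters: shuffling before splitting would leak future information into training and overstate accuracy for any of the compared models.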