Neural NILM: Deep Neural Networks Applied to Energy Disaggregation

Energy disaggregation estimates appliance-by-appliance electricity consumption from a single meter that measures the whole home's electricity demand. Recently, deep neural networks have driven remarkable improvements in classification performance in neighbouring machine learning fields such as image classification and automatic speech recognition. In this paper, we adapt three deep neural network architectures to energy disaggregation: 1) a form of recurrent neural network called "long short-term memory" (LSTM); 2) denoising autoencoders; and 3) a network which regresses the start time, end time and average power demand of each appliance activation. We use seven metrics to test the performance of these algorithms on real aggregate power data from five appliances. Tests are performed against a house not seen during training and against houses seen during training. We find that all three neural nets achieve better F1 scores (averaged over all five appliances) than either combinatorial optimisation or factorial hidden Markov models, and that our neural net algorithms generalise well to an unseen house.
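
To make the disaggregation setup concrete, the sketch below shows one plausible LSTM-based network of the kind described above: it maps a window of aggregate (mains) power readings to the estimated power of a single target appliance at each time step. This is a minimal illustration, not the paper's exact configuration; the Keras framework, window length, and layer sizes are all assumptions.

```python
# Minimal sketch of an LSTM-based disaggregation network.
# Illustrative only: framework, window length and layer sizes are assumptions,
# not the architecture or hyperparameters used in the paper.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, Bidirectional, LSTM, Dense

WINDOW = 600  # number of mains samples per training window (assumed)

model = Sequential([
    # Learn local features from the aggregate (mains) signal.
    Conv1D(16, kernel_size=4, padding="same", input_shape=(WINDOW, 1)),
    # Recurrent layers capture longer-range temporal structure,
    # e.g. appliance activations spanning many samples.
    Bidirectional(LSTM(128, return_sequences=True)),
    Bidirectional(LSTM(256, return_sequences=True)),
    # Regress the target appliance's power demand at every time step.
    Dense(128, activation="tanh"),
    Dense(1, activation="linear"),
])
model.compile(optimizer="adam", loss="mse")

# Toy usage: X is normalised aggregate power, y is the target appliance's
# normalised power, both shaped (num_windows, WINDOW, 1).
X = np.random.rand(32, WINDOW, 1).astype("float32")
y = np.random.rand(32, WINDOW, 1).astype("float32")
model.fit(X, y, epochs=1, batch_size=16, verbose=0)
```

At inference time, a sliding window of mains readings is fed through the trained network and the per-window estimates are stitched back together to give the appliance's estimated power series; the denoising autoencoder and start/end/average-power ("rectangles") architectures would replace the recurrent layers and the per-timestep output head accordingly.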
