Energy Disaggregation via Discriminative Sparse Coding

Energy disaggregation is the task of taking a whole-home energy signal and separating it into its component appliances. Studies have shown that providing device-level energy information can lead users to conserve significant amounts of energy, but current electricity meters report only whole-home data. Developing algorithmic methods for disaggregation is therefore a key technical challenge in the effort to maximize energy conservation. In this paper, we examine a large-scale energy disaggregation task and apply a novel extension of sparse coding to this problem. In particular, we develop a method, based on structured prediction, for discriminatively training sparse coding algorithms specifically to maximize disaggregation performance. We show that this significantly improves the performance of sparse coding algorithms on the energy task and illustrate how these disaggregation results can provide useful information about energy usage.
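To make the setup concrete, the sketch below shows the generative sparse coding baseline on which the discriminative method builds: a non-negative dictionary is learned per appliance from appliance-level training signals, and the whole-home signal is then coded against the concatenated dictionaries so that energy can be attributed per appliance. This is a minimal illustration, not the paper's implementation; the function names (learn_dictionary, disaggregate), the Hoyer-style multiplicative update rule, and all hyperparameters are assumptions chosen for brevity, and the discriminative (structured-prediction) training step described above is omitted.

```python
# Minimal sketch of non-negative sparse coding for energy disaggregation.
# Assumptions: per-appliance training matrices X_i of shape (T, m) with
# non-negative energy readings; hyperparameters are illustrative only.
import numpy as np

def learn_dictionary(X, n_atoms, sparsity=0.1, n_iter=200, seed=None):
    """Learn a non-negative dictionary for one appliance.

    X: (T, m) matrix of m training examples, each a length-T usage signal.
    Returns B: (T, n_atoms) non-negative dictionary with unit-norm columns.
    """
    rng = np.random.default_rng(seed)
    T, m = X.shape
    B = rng.random((T, n_atoms))
    A = rng.random((n_atoms, m))
    eps = 1e-9
    for _ in range(n_iter):
        # Multiplicative update of activations with an L1 sparsity penalty.
        A *= (B.T @ X) / (B.T @ B @ A + sparsity + eps)
        # Multiplicative update of the dictionary, then column renormalization.
        B *= (X @ A.T) / (B @ A @ A.T + eps)
        B /= np.linalg.norm(B, axis=0, keepdims=True) + eps
    return B

def disaggregate(x_bar, dictionaries, sparsity=0.1, n_iter=500):
    """Code the whole-home signal x_bar (length T) against the concatenated
    appliance dictionaries, then attribute the signal to each appliance."""
    B = np.hstack(dictionaries)               # (T, total number of atoms)
    a = np.full(B.shape[1], 1e-3)             # non-negative activations
    eps = 1e-9
    for _ in range(n_iter):
        a *= (B.T @ x_bar) / (B.T @ (B @ a) + sparsity + eps)
    estimates, start = [], 0
    for B_i in dictionaries:
        k = B_i.shape[1]
        estimates.append(B_i @ a[start:start + k])  # estimated appliance signal
        start += k
    return estimates
```

In the paper's discriminative extension, the dictionaries are trained further, via a structured-prediction objective, so that coding the aggregate signal directly minimizes disaggregation error rather than per-appliance reconstruction error alone; the sketch above implements only the generative baseline.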
