Forecasting the Evolution of Hydropower Generation

Hydropower is the world's largest renewable source of electricity generation, with numerous benefits in terms of environmental protection (near-zero air pollution and climate impact), cost-effectiveness (a long service life with little exposure to market fluctuations), and reliability (the ability to respond quickly to surges in demand). However, the effectiveness of hydropower plants is affected by multiple factors such as reservoir capacity, rainfall, temperature, and fluctuating electricity demand, and in particular by the complicated relationships among these factors, which makes predicting a station's operational output a difficult challenge. In this paper, we present DeepHydro, a novel stochastic method for modeling multivariate time series (e.g., water inflow/outflow and temperature) and forecasting the power generation of hydropower stations. DeepHydro captures temporal dependencies in co-evolving time series with a new conditioned latent recurrent neural network, which not only considers the hidden states of observations but also preserves the uncertainty of the latent variables. We introduce a generative network parameterized by a continuous normalizing flow to approximate the complex posterior distribution of multivariate time series data, and further use neural ordinary differential equations to estimate the continuous-time dynamics of the latent variables underlying the observable data. This allows our model to handle discrete observations in the context of continuous dynamical systems while remaining robust to noise. We conduct extensive experiments on real-world datasets from a large power generation company operating cascade hydropower stations. The experimental results demonstrate that the proposed method effectively predicts power production and significantly outperforms competitive baseline approaches.
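
To make the architecture described above concrete, the following is a minimal, hypothetical sketch (not the authors' released implementation) of its main ingredients in PyTorch: a GRU encoder that conditions a latent variable on the observed multivariate series, a neural-ODE function that evolves that latent state in continuous time (integrated with `torchdiffeq.odeint`), and a decoder that maps latent states to power-generation forecasts. All module names, dimensions, and hyperparameters are illustrative assumptions; the continuous normalizing flow that refines the approximate posterior, which would additionally integrate the log-density via d log p(z(t))/dt = -tr(∂f/∂z), is omitted for brevity.

```python
# Minimal latent-ODE forecasting sketch, assuming PyTorch and torchdiffeq.
# It is an illustration of the ingredients named in the abstract, not DeepHydro itself.

import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq


class LatentODEFunc(nn.Module):
    """dz/dt = f(z, t): continuous-time dynamics of the latent state."""
    def __init__(self, latent_dim=8, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden), nn.ELU(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, t, z):
        return self.net(z)


class LatentODEForecaster(nn.Module):
    def __init__(self, input_dim, latent_dim=8, rnn_hidden=64):
        super().__init__()
        self.encoder = nn.GRU(input_dim, rnn_hidden, batch_first=True)
        self.to_posterior = nn.Linear(rnn_hidden, 2 * latent_dim)  # mean and log-variance
        self.odefunc = LatentODEFunc(latent_dim)
        self.decoder = nn.Linear(latent_dim, 1)  # power-generation estimate

    def forward(self, x, forecast_times):
        # x: (batch, time, input_dim) observed covariates (inflow, outflow, temperature, ...)
        _, h = self.encoder(x)                       # condition the latent state on the history
        mu, logvar = self.to_posterior(h[-1]).chunk(2, dim=-1)
        z0 = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        # Integrate the latent ODE to the requested (possibly irregular) forecast times.
        zs = odeint(self.odefunc, z0, forecast_times)  # (n_times, batch, latent_dim)
        return self.decoder(zs).squeeze(-1), mu, logvar


# Usage: forecast 6 future steps of power output from 24 observed steps of 3 covariates.
model = LatentODEForecaster(input_dim=3)
x = torch.randn(16, 24, 3)
t = torch.linspace(0.0, 1.0, 6)
y_hat, mu, logvar = model(x, t)
print(y_hat.shape)  # torch.Size([6, 16])
```

In such a setup the model would typically be trained with a negative log-likelihood (or squared-error) term on the forecasts plus a KL term on (mu, logvar), in the spirit of a variational autoencoder.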
