Energy Storage Management via Deep Q-Networks

Energy storage devices are environmentally friendly candidates for coping with volatile renewable energy generation. Motivated by the growth of privately owned storage systems, this paper studies the real-time control of a storage unit co-located with a renewable energy generator and an inelastic load. Unlike many approaches in the literature, no distributional assumptions are made on the renewable energy generation or the real-time prices. Building on the deep Q-networks (DQN) algorithm, a reinforcement learning approach is devised in which a neural network approximates the action-value function that dictates which action (charging, discharging, or idling) to take, while the storage unit's operational constraints are respected. Simulations indicate that the proposed learning-based control policy attains near-optimal performance for the storage units.
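The core mechanism described above, a neural action-value function whose action choice is restricted to operationally feasible charge/discharge decisions, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the storage parameters, the three-action set, the tiny numpy MLP, and the state features (state of charge, price, net load) are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical storage parameters (illustrative, not from the paper).
CAPACITY = 10.0  # kWh
MAX_RATE = 2.0   # kW per time step
ACTIONS = np.array([-MAX_RATE, 0.0, +MAX_RATE])  # discharge, idle, charge

def feasible_mask(soc):
    """Disallow actions that would push the state of charge outside [0, CAPACITY]."""
    next_soc = soc + ACTIONS
    return (next_soc >= 0.0) & (next_soc <= CAPACITY)

class QNet:
    """Tiny one-hidden-layer MLP approximating Q(s, .) over the 3 actions."""
    def __init__(self, n_in=3, n_hidden=16, n_out=3):
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, s):
        self.h = np.tanh(s @ self.W1 + self.b1)  # cache for the TD step
        return self.h @ self.W2 + self.b2

    def td_step(self, s, a, target, lr=1e-2):
        """One SGD step on the squared TD error for the selected action a."""
        q = self.forward(s)
        err = q[a] - target
        one_hot = np.eye(len(ACTIONS))[a]
        dpre = (self.W2[:, a] * err) * (1.0 - self.h ** 2)
        self.W2 -= lr * np.outer(self.h, one_hot) * err
        self.b2 -= lr * one_hot * err
        self.W1 -= lr * np.outer(s, dpre)
        self.b1 -= lr * dpre

net = QNet()
soc = 9.5
state = np.array([soc / CAPACITY, 0.3, 0.7])  # normalized SoC, price, net load
q = net.forward(state)
mask = feasible_mask(soc)                 # charging 2 kW from 9.5 kWh is infeasible
q_masked = np.where(mask, q, -np.inf)     # constraint-respecting greedy choice
action = int(np.argmax(q_masked))
```

Masking infeasible actions before the argmax is one simple way to guarantee the learned policy never violates the capacity bounds, regardless of what the network outputs; the full DQN machinery (replay buffer, target network) would wrap around this same forward/masking step.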
