Deep Reinforcement Scheduling of Energy Storage Systems for Real-Time Voltage Regulation in Unbalanced LV Networks With High PV Penetration

The ever-growing penetration of distributed energy resources (DERs) in low-voltage (LV) distribution systems brings both opportunities and challenges for voltage support and regulation. This paper proposes a deep reinforcement learning (DRL)-based scheduling scheme for energy storage systems (ESSs) to mitigate voltage deviations in unbalanced LV distribution networks. The ESS-based voltage regulation problem is formulated as a multi-stage quadratic stochastic program, with the objective of minimizing the expected total daily voltage regulation cost while satisfying operational constraints. Whereas existing voltage regulation methods mostly focus on single-time-step control, this paper addresses a day-horizon, system-wide voltage regulation problem; consequently, the state and action spaces are extremely high-dimensional and must be handled carefully. Furthermore, to overcome the difficulty of modeling uncertainties and to obtain a real-time solution, a learn-to-schedule feedback control framework is proposed by adapting the problem to a model-free DRL setting. The proposed algorithm is tested on a customized 6-bus system and a modified IEEE 34-bus system. Simulation results validate the effectiveness and near-optimality of the proposed ESS-based voltage regulation in comparison with a deterministic quadratic programming solution.
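For concreteness, the sketch below illustrates how the learn-to-schedule idea described above might be cast as a model-free DRL environment, with the reward mirroring a quadratic voltage-regulation cost. All names and constants (ESSVoltageEnv, N_BUS, N_ESS, V_REF, the toy power-flow and state-of-charge models) are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only: a toy gym-style environment for DRL-based ESS scheduling.
# The network model, reward weights, and variable names are assumptions.
import numpy as np

N_BUS, N_ESS, T = 6, 2, 96      # buses, storage units, 15-min steps per day (assumed)
V_REF = 1.0                      # per-unit reference voltage

class ESSVoltageEnv:
    """State: bus voltages + ESS state of charge + time index.
    Action: charge(-)/discharge(+) power of each ESS, normalized to [-1, 1]."""

    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = np.full(N_ESS, 0.5)             # 50% initial state of charge
        self.volt = self._power_flow(np.zeros(N_ESS))
        return self._obs()

    def _power_flow(self, p_ess):
        # Placeholder for an unbalanced three-phase power-flow solver: voltages are
        # a random PV-driven deviation plus a crude aggregate ESS effect.
        pv_effect = 0.05 * self.rng.standard_normal(N_BUS)
        ess_effect = 0.02 * p_ess.sum()
        return V_REF + pv_effect - ess_effect

    def _obs(self):
        return np.concatenate([self.volt, self.soc, [self.t / T]])

    def step(self, action):
        p_ess = np.clip(action, -1.0, 1.0)
        self.soc = np.clip(self.soc - 0.01 * p_ess, 0.0, 1.0)   # simple SoC update
        self.volt = self._power_flow(p_ess)
        # Reward: negative quadratic voltage deviation minus a small usage penalty,
        # echoing the quadratic regulation-cost objective described in the abstract.
        reward = -np.sum((self.volt - V_REF) ** 2) - 1e-3 * np.sum(p_ess ** 2)
        self.t += 1
        return self._obs(), reward, self.t >= T, {}
```

Any standard off-policy or policy-gradient agent could then be trained against such an environment; the episode horizon of one day reflects the day-horizon scheduling problem studied in the paper.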