Real-time vehicle-to-grid control algorithm under price uncertainty

A vehicle-to-grid (V2G) system enables energy to flow from electric vehicles (EVs) back to the grid. When V2G is implemented, the distributed power of the EVs can either be sold to the grid or used to provide frequency-regulation service. A V2G control algorithm must therefore decide, in each hour, whether the EV should charge, discharge, or provide frequency regulation. The problem is further complicated by price uncertainty, since the electricity price is determined dynamically every hour. In this paper, we study the real-time V2G control problem under price uncertainty. We model the electricity price as a Markov chain with unknown transition probabilities and formulate the problem as a Markov decision process (MDP). This model implicitly estimates the impact of future electricity prices and the current control operation on long-term profits. The Q-learning algorithm is then used to adapt the control operation to the hourly available price so as to maximize the EV owner's profit over the whole parking time. We evaluate the proposed V2G control algorithm using both simulated prices and actual 2010 prices from PJM. Simulation results show that the proposed algorithm works effectively in a real electricity market and increases the profit significantly compared with a conventional EV charging scheme.
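The hourly control loop described above can be sketched with tabular Q-learning. This is a minimal illustration, not the paper's implementation: the price levels, the price-state transition matrix, the regulation payment, and the battery size below are all assumed values chosen for the example, and the state here is simply (price state, state of charge).

```python
import random

PRICES = (0.03, 0.06, 0.12)     # $/kWh for low/mid/high price states (assumed)
P = ((0.7, 0.2, 0.1),           # assumed hourly price-state transition matrix
     (0.2, 0.6, 0.2),
     (0.1, 0.3, 0.6))
CHARGE, DISCHARGE, REGULATE = 0, 1, 2
REG_PAYMENT = 0.02              # assumed regulation revenue per kWh per hour
ENERGY = 10.0                   # kWh exchanged in one charge/discharge hour
SOC_MAX = 2                     # battery holds at most two "hours" of energy

def valid_actions(soc):
    # Charging needs spare capacity; discharging needs stored energy.
    acts = [REGULATE]
    if soc < SOC_MAX:
        acts.append(CHARGE)
    if soc > 0:
        acts.append(DISCHARGE)
    return acts

def step(price_state, soc, action):
    """Simulate one hour: return (profit, next price state, next state of charge)."""
    if action == CHARGE:
        r, soc = -PRICES[price_state] * ENERGY, soc + 1
    elif action == DISCHARGE:
        r, soc = PRICES[price_state] * ENERGY, soc - 1
    else:
        r = REG_PAYMENT * ENERGY
    # Sample the next price state from the Markov chain.
    u, cum = random.random(), 0.0
    for s, p in enumerate(P[price_state]):
        cum += p
        if u < cum:
            return r, s, soc
    return r, len(PRICES) - 1, soc

def q_learning(episodes=5000, horizon=12, alpha=0.1, gamma=0.95, eps=0.1):
    # One Q-value per (price state, state of charge, action).
    Q = {(p, c): [0.0, 0.0, 0.0]
         for p in range(len(PRICES)) for c in range(SOC_MAX + 1)}
    for _ in range(episodes):               # each episode is one parking period
        s = (random.randrange(len(PRICES)), random.randrange(SOC_MAX + 1))
        for _ in range(horizon):
            acts = valid_actions(s[1])
            a = random.choice(acts) if random.random() < eps \
                else max(acts, key=lambda x: Q[s][x])   # epsilon-greedy
            r, p2, c2 = step(s[0], s[1], a)
            s2 = (p2, c2)
            best_next = max(Q[s2][x] for x in valid_actions(c2))
            Q[s][a] += alpha * (r + gamma * best_next - Q[s][a])
            s = s2
    return Q

random.seed(0)
Q = q_learning()
# Greedy hourly policy: for each (price state, state of charge), pick best action.
policy = {s: max(valid_actions(s[1]), key=lambda a: Q[s][a]) for s in Q}
```

Because the price-transition probabilities are treated as unknown, the learner never reads `P` directly; it only observes the sampled next price each hour, which mirrors how the algorithm would adapt to hourly market prices in practice.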
