Multiple agents and reinforcement learning for modelling charging loads of electric taxis

Abstract The charging load modelling of electric vehicles (EVs) is of great importance for the safe and stable operation of power systems. However, traditional Monte Carlo and mathematical optimization methods struggle to establish a detailed and precise charging load model for EVs on both the temporal and spatial scales, especially for plug-in electric taxis (PETs), owing to their strongly random characteristics and complex operating behaviours. To solve this problem, multiple agents and multi-step Q(λ) learning are utilized to model the charging loads of PETs on both the temporal and spatial scales. Firstly, a multi-agent framework is developed based on the Java Agent DEvelopment Framework (JADE), in which a variety of agents are built to simulate the operation-related participants as well as the operational environment. Then, multi-step Q(λ) learning is developed for the PET agents to make decisions under various situations, and its performance is compared with that of Q-learning. Simulation results illustrate that the proposed framework is able to dynamically simulate daily PET operation and to obtain the charging loads of PETs on both the temporal and spatial scales, and that multi-step Q(λ) learning outperforms Q-learning in terms of convergence rate and reward. Moreover, PET shift strategies and electricity pricing mechanisms are investigated, and the results indicate that appropriate PET operation rules significantly improve the safe and reliable operation of power systems.
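The multi-step Q(λ) learning named in the abstract can be illustrated with a minimal tabular sketch. The toy chain environment, hyper-parameters, and function names below are illustrative assumptions for exposition only, not the paper's PET model; the sketch follows Watkins's Q(λ), where eligibility traces propagate the temporal-difference error over multiple preceding steps, which is the mechanism behind the faster convergence compared with one-step Q-learning.

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATES, N_ACTIONS = 5, 2          # toy chain MDP; actions: 0 = left, 1 = right
GOAL = N_STATES - 1

def step(s, a):
    """Transition for the toy chain (a stand-in for the PET environment)."""
    s2 = min(s + 1, GOAL) if a == 1 else max(s - 1, 0)
    r = 1.0 if s2 == GOAL else 0.0
    return s2, r, s2 == GOAL

def train_q_lambda(episodes=200, alpha=0.2, gamma=0.9, lam=0.8, eps=0.1):
    Q = np.zeros((N_STATES, N_ACTIONS))
    for _ in range(episodes):
        E = np.zeros_like(Q)        # eligibility traces, reset each episode
        s, done = 0, False
        while not done:
            # epsilon-greedy behaviour policy
            a = int(rng.integers(N_ACTIONS)) if rng.random() < eps \
                else int(np.argmax(Q[s]))
            greedy = a == int(np.argmax(Q[s]))
            s2, r, done = step(s, a)
            # one-step TD error w.r.t. the greedy successor action
            delta = r + gamma * np.max(Q[s2]) * (not done) - Q[s, a]
            E[s, a] += 1.0          # accumulating trace for the visited pair
            Q += alpha * delta * E  # multi-step credit assignment via traces
            # Watkins's Q(lambda): decay traces after greedy actions,
            # cut them to zero after exploratory ones
            E = E * (gamma * lam) if greedy else np.zeros_like(Q)
            s = s2
    return Q

Q = train_q_lambda()
# Greedy action in every non-goal state after training
print([int(np.argmax(Q[s])) for s in range(GOAL)])
```

Setting `lam=0` recovers ordinary one-step Q-learning, so the same loop can be used to reproduce the kind of convergence comparison the abstract describes.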
