Deep Reinforcement Learning-Based Energy Management for a Series Hybrid Electric Vehicle Enabled by History Cumulative Trip Information

It is essential to develop energy management strategies (EMSs) with broad adaptability for hybrid electric vehicles (HEVs). This paper applies deep reinforcement learning (DRL) to develop EMSs for a series HEV, exploiting DRL's advantages of requiring no future driving information during derivation and of generalizing well when the energy management problem is formulated as a Markov decision process. Historical cumulative trip information is also integrated to provide effective state-of-charge (SOC) guidance in the DRL-based EMSs. The proposed method is systematically introduced from offline training to online application; its learning ability, optimality, and generalization are validated by comparison with a fuel-economy benchmark optimized by dynamic programming and with real-time EMSs based on model predictive control (MPC). Simulation results indicate that, without a priori knowledge of the future trip, the original DRL-based EMS achieves an average gap of 3.5% from the benchmark, outperforming the MPC-based EMS with accurate prediction; after output-frequency adjustment is further applied, a mean gap of 8.7%, comparable with the MPC-based EMS under a mean prediction error of 1 m/s, is maintained while the number of engine starts is noticeably reduced. Moreover, its computation time of about 0.001 s per simulation step demonstrates its potential for practical application, and the method is independent of powertrain topology, making it applicable to any type of HEV even when future driving information is unavailable.
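
To make the problem setup concrete, the following is a minimal sketch (not the authors' implementation) of how a series-HEV energy management task can be framed as a Markov decision process for a DRL agent: the state combines battery SOC, vehicle speed, and a cumulative trip-progress signal usable for SOC guidance; the action is the genset (engine/generator) power; and the reward penalizes fuel use plus deviation from a reference SOC. All vehicle parameters, the `SeriesHevEnv` class, and the threshold policy in the rollout are illustrative assumptions, not values from the paper; in the paper's setting a continuous-action DRL agent (e.g., DDPG) would learn the state-to-power mapping.

```python
import numpy as np


class SeriesHevEnv:
    """Toy series-HEV environment (illustrative only): action = genset power in kW."""

    def __init__(self, cycle_speed_mps, battery_kwh=10.0, soc0=0.8):
        self.v = np.asarray(cycle_speed_mps, dtype=float)
        self.battery_kwh = battery_kwh
        self.soc0 = soc0
        self.reset()

    def reset(self):
        self.k = 0
        self.soc = self.soc0
        return self._state()

    def _demand_kw(self):
        # Rough longitudinal power demand with assumed vehicle parameters (1 s steps).
        v = self.v[self.k]
        a = self.v[min(self.k + 1, len(self.v) - 1)] - v
        mass, cd_a, cr, rho, g = 1500.0, 0.6, 0.01, 1.2, 9.81
        force = mass * a + 0.5 * rho * cd_a * v**2 + cr * mass * g
        return max(force * v, 0.0) / 1000.0

    def _state(self):
        trip_progress = self.k / len(self.v)  # cumulative trip information
        return np.array([self.soc, self.v[self.k], trip_progress])

    def step(self, genset_kw):
        batt_kw = self._demand_kw() - genset_kw                 # battery covers the balance
        self.soc -= batt_kw / (self.battery_kwh * 3600.0)       # 1 s step, kWh -> kW*s
        fuel_g = genset_kw * 250.0 / 3600.0                     # assumed ~250 g/kWh BSFC
        soc_ref = self.soc0                                     # could also depend on trip progress
        reward = -(fuel_g + 100.0 * (self.soc - soc_ref) ** 2)  # fuel + SOC-deviation penalty
        self.k += 1
        done = self.k >= len(self.v) - 1
        return self._state(), reward, done


# Placeholder rollout; a trained DRL policy would replace this threshold rule.
env = SeriesHevEnv(np.abs(np.sin(np.linspace(0.0, 10.0, 600))) * 15.0)
state, done, total_reward = env.reset(), False, 0.0
while not done:
    action_kw = 25.0 if state[0] < env.soc0 else 0.0
    state, reward, done = env.step(action_kw)
    total_reward += reward
print(f"episode return: {total_reward:.1f}")
```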
