Hybrid Electric Vehicle Energy Management With Computer Vision and Deep Reinforcement Learning

Modern vehicles are equipped with a rapidly growing array of onboard computer-vision hardware and software, which can be leveraged for eco-driving. This article combines computer vision and deep reinforcement learning (DRL) to improve the fuel economy of hybrid electric vehicles. The proposed method autonomously learns the optimal control policy from visual inputs: a state-of-the-art convolutional neural network-based object detector extracts visual information from onboard cameras, and the detected information serves as a state input to a continuous DRL model that outputs energy management strategies. To evaluate the proposed method, we construct 100 km of real city and highway driving cycles that incorporate visual information. The results show that the DRL-based system with visual information consumes 4.3–8.8% less fuel than the one without, and the proposed method achieves 96.5% of the fuel economy of the globally optimal dynamic programming benchmark.
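The pipeline described above (camera detections reduced to state features, fed to a continuous deterministic policy) can be illustrated with a minimal sketch. This is not the paper's implementation: the feature choices, dimensions, normalization constants, and the single-layer actor are all illustrative assumptions; a trained DDPG-style actor network would replace the random weights.

```python
import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 5   # hypothetical state: [SoC, speed, demanded power, nearest-object distance, red-light flag]
ACTION_DIM = 1  # normalized continuous energy-management action in [-1, 1]

# Randomly initialized single-layer actor -- a stand-in for the trained network.
W = rng.normal(scale=0.1, size=(ACTION_DIM, STATE_DIM))
b = np.zeros(ACTION_DIM)

def detections_to_features(detections):
    """Reduce YOLO-style detections (class + distance) to two scalar cues:
    normalized nearest-vehicle distance and a red-light indicator."""
    dists = [d["distance_m"] for d in detections if d["cls"] in ("car", "truck")]
    nearest = min(dists) if dists else 200.0  # cap when the road ahead is clear
    red_light = float(any(d["cls"] == "red_light" for d in detections))
    return nearest / 200.0, red_light

def actor(soc, speed_mps, p_demand_kw, detections):
    """Deterministic policy: tanh squashes the action into [-1, 1]."""
    near, red = detections_to_features(detections)
    s = np.array([soc, speed_mps / 30.0, p_demand_kw / 50.0, near, red])
    return np.tanh(W @ s + b)

# Example: approaching a lead vehicle at 40 m with a red light detected at 60 m.
a = actor(soc=0.6, speed_mps=12.0, p_demand_kw=20.0,
          detections=[{"cls": "car", "distance_m": 40.0},
                      {"cls": "red_light", "distance_m": 60.0}])
```

The tanh output keeps the action continuous and bounded, matching the deterministic-policy-gradient setting; during training, exploration noise (e.g. Ornstein–Uhlenbeck or parameter-space noise) would be added to this output.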
