Battery Health-Aware and Deep Reinforcement Learning-Based Energy Management for Naturalistic Data-Driven Driving Scenarios
- Author
Lech M. Grzesiak, Xiaosong Hu, Xianke Lin, Jieming Zhang, Dawei Pi, and Xiaolin Tang
- Subjects
Battery (electricity), Computer science, Energy management, Stability (learning theory), Energy Engineering and Power Technology, Transportation, Traffic flow, Reliability engineering, Data-driven, Dynamic programming, Automotive Engineering, Fuel efficiency, Reinforcement learning, Electrical and Electronic Engineering
- Abstract
This paper proposes a battery health-aware and deep reinforcement learning (DRL)-based energy management framework for power-split hybrid electric vehicles in naturalistic driving scenarios. First, based on data collected from actual traffic flow, a data-driven method is used to establish driving scenarios that reflect different driving patterns and behaviors. Second, expert knowledge is embedded into the deep deterministic policy gradient (DDPG) algorithm to achieve faster convergence with guaranteed vehicle performance. Third, the superiority of the control strategy is achieved by optimizing the trade-off among fuel consumption, battery aging cost, and state-of-charge (SOC) sustainability penalty under different weight coefficients, and is verified by comparison with existing state-of-the-art strategies, including the deep Q-network (DQN) and dynamic programming (DP). The results show that the proposed strategy can slow down battery aging by lowering the operating severity factor with a minimal fuel economy penalty, while retaining accelerated iterative convergence compared with DQN. The benefits of the proposed strategy become especially evident when the vehicle operates under high power demand, and the strategy exhibits good stability in coping with changes in operating conditions.
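The trade-off described in the third step can be read as a weighted per-step cost. The sketch below is a minimal illustration only, assuming a quadratic charge-sustaining SOC penalty and hypothetical names and weight values (w_fuel, w_aging, w_soc, reference_soc); the paper's actual reward formulation and coefficients are not given in this abstract.

```python
# Minimal sketch of a per-step reward trading off fuel use, battery aging,
# and SOC sustainability. All names and numeric values are illustrative
# assumptions, not the paper's formulation.

def step_reward(fuel_rate_gps: float,
                severity_factor: float,
                soc: float,
                w_fuel: float = 1.0,
                w_aging: float = 0.5,
                w_soc: float = 2.0,
                reference_soc: float = 0.6) -> float:
    """Return a (negative) cost-style reward for one control step.

    fuel_rate_gps   -- instantaneous engine fuel rate [g/s]
    severity_factor -- battery operating severity factor (higher = faster aging)
    soc             -- current battery state of charge in [0, 1]
    """
    fuel_cost = w_fuel * fuel_rate_gps
    aging_cost = w_aging * severity_factor
    soc_penalty = w_soc * (soc - reference_soc) ** 2  # quadratic charge-sustaining term
    return -(fuel_cost + aging_cost + soc_penalty)


if __name__ == "__main__":
    # Example: moderate fuel rate, mild battery stress, SOC slightly below target.
    print(step_reward(fuel_rate_gps=1.2, severity_factor=0.8, soc=0.55))
```

Adjusting the relative weights shifts the learned policy between fuel economy, battery longevity, and charge sustenance, which is the comparison the abstract reports across different weight coefficients.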
- Published
2022