1. Learning to Drive Like Human Beings: A Method Based on Deep Reinforcement Learning.
- Author
- Tian, Yantao; Cao, Xuanhao; Huang, Kai; Fei, Cong; Zheng, Zhu; Ji, Xuewu
- Abstract
In this paper, a new framework for path tracking is proposed through learning to drive like human beings. Firstly, an imitation learning algorithm (behavior cloning) is adopted to initialize the deep reinforcement learning (DRL) model by learning from professional drivers' experience. Secondly, a continuous, deterministic, model-free deep reinforcement learning algorithm is adopted to optimize the DRL model online through trial and error. By combining behavior cloning and deep reinforcement learning, the DRL model can quickly learn an effective policy for path tracking using easy-to-measure vehicle state parameters and environment information as inputs. An Actor-Critic structure is adopted in the DRL algorithm. To speed up the convergence of the DRL model and improve the learning effect, we propose a dual actor network structure for the two different action outputs (steering wheel angle and vehicle speed), and a chief critic network is built to guide the updating of both actor networks simultaneously. Based on this dual actor network structure, the more relevant state information can be selected as the state input for each action output. In addition, a reward mechanism for autonomous driving is presented. Finally, simulation training and experimental tests are carried out, and the results confirm that the proposed framework is more data efficient than the original algorithm, and that the trained DRL model can track the reference path accurately and generalizes to different roads.
- Published
- 2022
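
The abstract describes a dual actor / chief critic architecture with behavior-cloning pretraining. The sketch below is not the authors' code; it is a minimal PyTorch illustration of that idea, and all layer sizes, state splits, and function names are assumptions for illustration only.

```python
# Minimal sketch (assumed implementation, not the paper's code) of a
# dual-actor / chief-critic structure with behavior-cloning pretraining.
import torch
import torch.nn as nn

class Actor(nn.Module):
    """One actor head for a single continuous action (steering angle or speed)."""
    def __init__(self, state_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Tanh(),   # action normalized to [-1, 1]
        )

    def forward(self, state):
        return self.net(state)

class ChiefCritic(nn.Module):
    """Single critic scoring the full state plus both actions, so one value
    estimate guides the updates of both actors at the same time."""
    def __init__(self, state_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + 2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, state, steer, speed):
        return self.net(torch.cat([state, steer, speed], dim=-1))

# Assumed state splits: each actor sees only the subset of state variables
# most relevant to its own action output (as the abstract suggests).
steer_actor = Actor(state_dim=6)   # e.g. lateral error, heading error, curvature
speed_actor = Actor(state_dim=4)   # e.g. longitudinal speed, road information
critic = ChiefCritic(state_dim=8)  # full state for the chief critic

# Stage 1: behavior cloning -- fit an actor to professional drivers' recorded
# actions with a supervised (MSE) loss before any reinforcement learning.
def behavior_cloning_step(actor, states, expert_actions, optimizer):
    loss = nn.functional.mse_loss(actor(states), expert_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Stage 2 (omitted here): continue training online with a deterministic
# actor-critic DRL algorithm, using the chief critic's value estimate as the
# policy-gradient signal for both actors.
```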