
AGBM: An Adaptive Gradient Balanced Mechanism for the End-to-End Steering Estimation.

Authors :
Yuan, Wei
Zhuang, Hanyang
Wang, Chunxiang
Yang, Ming
Source :
IEEE Transactions on Intelligent Transportation Systems; Sep 2022, Vol. 23 Issue 9, p16016-16025, 10p
Publication Year :
2022

Abstract

End-to-end steering estimation is an important deep regression task. However, driving datasets are typically imbalanced in the distribution of steering values, which degrades the steering estimation accuracy of end-to-end learning models, especially for sharp steering values, and ultimately leads to an imbalanced training problem. In this paper, the essential cause of the imbalanced training problem in deep regression is first revealed to be gradient disharmony. To hedge the gradient disharmony, a general gradient balanced mechanism for deep regression tasks is proposed and then applied to the steering estimation task. Based on the steering distribution, a novel Adaptive Gradient Balanced Mechanism (AGBM) is proposed that hedges the gradient disharmony using the Laplace distribution within the general gradient balanced mechanism framework. Further, a new loss function embedding AGBM is proposed to balance the gradient disharmony during training. AGBM performs gradient hedging adaptively, without parameter fine-tuning. Four experiments on five driving datasets are conducted for demonstration. The experimental results show that AGBM enables end-to-end models to achieve the best steering estimation accuracy, including on sharp steering samples. Moreover, AGBM generalizes across different end-to-end learning models. [ABSTRACT FROM AUTHOR]
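
The exact AGBM formulation is not given in this record; the sketch below is only a minimal illustration of the general idea the abstract describes: fitting a Laplace density to the training steering values and using its inverse to reweight the per-sample regression loss, so that rare sharp-steering samples contribute a larger, more balanced gradient. The class name, the specific weighting rule, and the normalization are assumptions, not the authors' implementation.

```python
# Illustrative sketch only: Laplace-density-based loss reweighting for
# imbalanced steering regression. The weighting rule and all names are
# assumptions; this is NOT the AGBM formulation from the paper.
import torch
import torch.nn as nn


class LaplaceBalancedL1Loss(nn.Module):
    """L1 loss whose per-sample weight is the inverse of a Laplace density
    fitted to the training steering distribution: rare sharp steering
    angles get larger weights, frequent near-zero angles get smaller ones."""

    def __init__(self, steering_values: torch.Tensor):
        super().__init__()
        # Maximum-likelihood estimates of Laplace(mu, b) on the labels.
        mu = steering_values.median()
        b = (steering_values - mu).abs().mean().clamp_min(1e-6)
        self.register_buffer("mu", mu)
        self.register_buffer("b", b)

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Laplace density of each target steering value.
        density = torch.exp(-(target - self.mu).abs() / self.b) / (2 * self.b)
        # Inverse-density weights, normalized to mean 1 so the batch-level
        # gradient scale stays comparable to a plain L1 loss.
        weights = 1.0 / density.clamp_min(1e-6)
        weights = weights / weights.mean()
        return (weights * (pred - target).abs()).mean()


# Usage sketch: fit the weighting on the training labels once, then use it
# as a drop-in replacement for the regression loss.
steering_labels = torch.randn(10_000) * 0.3   # placeholder training labels
criterion = LaplaceBalancedL1Loss(steering_labels)
pred = torch.zeros(8, requires_grad=True)     # dummy model output
target = torch.full((8,), 0.9)                # sharp-steering batch
loss = criterion(pred, target)
loss.backward()                               # sharp samples receive up-weighted gradients
```

Unlike this fixed inverse-density weighting, the paper's mechanism is described as adaptive and free of parameter fine-tuning; the sketch only conveys the gradient-balancing intuition.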

Details

Language :
English
ISSN :
1524-9050
Volume :
23
Issue :
9
Database :
Complementary Index
Journal :
IEEE Transactions on Intelligent Transportation Systems
Publication Type :
Academic Journal
Accession number :
159209322
Full Text :
https://doi.org/10.1109/TITS.2022.3147248