
Driver Digital Twin for Online Recognition of Distracted Driving Behaviors

Authors :
Ma, Yunsheng
Du, Runjia
Abdelraouf, Amr
Han, Kyungtae
Gupta, Rohit
Wang, Ziran
Source :
IEEE Transactions on Intelligent Vehicles; February 2024, Vol. 9 Issue: 2 p3168-3180, 13p
Publication Year :
2024

Abstract

Deep learning has been widely utilized in intelligent vehicle systems, particularly for driver distraction detection. However, existing methods tend to focus solely on appearance or cognitive state as indicators of distraction, while neglecting the significance of temporal modeling in accurately identifying driver actions. This oversight leads to limitations such as difficulty in understanding context, inability to recognize gradual changes, and failure to capture complex behaviors. To address these limitations, this paper introduces a new framework based on the concept of the Driver Digital Twin (DDT). The DDT framework serves as a digital replica of the driver, capturing their naturalistic driving data and behavioral models. It consists of a transformer-based driver action recognition module and a novel temporal localization module to detect distracted behaviors. Additionally, we propose a pseudo-labeled multi-task learning algorithm that incorporates driver emotion recognition as supplementary information for recognizing distractions. We have validated the effectiveness of our approach on three publicly available driver distraction detection benchmarks: SFDDD, AUCDD, and SynDD2. The results demonstrate that our framework achieves state-of-the-art performance in both driver action recognition and temporal localization tasks. It outperforms the leading methods by 6.5 and 0.9 percentage points on SFDDD and AUCDD, respectively. Furthermore, it ranks in the top 5% on the SynDD2 leaderboard.
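The pseudo-labeled multi-task objective described above can be sketched as a weighted sum of a supervised action-classification loss and an auxiliary emotion-recognition loss computed on pseudo labels. This is a minimal, framework-agnostic illustration; the weighting factor `lam` and the plain cross-entropy formulation are assumptions for illustration, not the paper's exact formulation.

```python
import math

def cross_entropy(probs, label):
    """Negative log-likelihood of the true class, given a
    probability distribution over classes."""
    return -math.log(probs[label])

def multitask_loss(action_probs, action_label,
                   emotion_probs, emotion_pseudo_label,
                   lam=0.3):
    """Combined multi-task loss: a supervised driver-action term
    plus a pseudo-labeled emotion term.

    `lam` is a hypothetical auxiliary-task weight; the abstract does
    not specify how the two terms are balanced."""
    l_action = cross_entropy(action_probs, action_label)
    l_emotion = cross_entropy(emotion_probs, emotion_pseudo_label)
    return l_action + lam * l_emotion

# Example: confident action prediction, uncertain emotion pseudo label.
loss = multitask_loss([0.7, 0.2, 0.1], 0, [0.5, 0.5], 1)
```

In practice the pseudo labels for the emotion task would come from a pretrained emotion recognizer run over the unlabeled driving footage, so the auxiliary term regularizes the shared representation without requiring manual emotion annotation.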

Details

Language :
English
ISSN :
2379-8858
Volume :
9
Issue :
2
Database :
Supplemental Index
Journal :
IEEE Transactions on Intelligent Vehicles
Publication Type :
Periodical
Accession number :
ejs66238561
Full Text :
https://doi.org/10.1109/TIV.2024.3353253