Research on driver's anger recognition method based on multimodal data fusion.
- Source: Traffic Injury Prevention; 2024, Vol. 25, Issue 3, p354-363, 10p
- Publication Year: 2024
Abstract
- This paper addresses the low accuracy of single-modal driver anger recognition by introducing a multimodal recognition model that fuses electrocardiographic (ECG) signals with driving behavior signals. Emotion-inducing experiments were conducted on a driving simulator to capture ECG and driving behavior signals from drivers in both angry and calm states. The relationships between these signals and driving anger were analyzed, features were extracted from both modalities, and 17 effective feature indicators were selected to construct a driver anger dataset. A binary classification model for recognizing driving anger was built with the Support Vector Machine (SVM) algorithm. Multimodal fusion showed clear advantages over single-modal approaches: the SVM-DS model, which fuses the two modalities at the decision level, achieved the highest accuracy, 84.75%. Compared with recognition models based on unimodal ECG features, unimodal driving behavior features, and multimodal feature-level fusion, accuracy increased by 9.10%, 4.15%, and 0.8%, respectively. The proposed multimodal model, incorporating ECG and driving behavior signals, effectively identifies driving anger, and the results provide theoretical and technical support for building a driver anger recognition system.
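
The decision-level fusion scheme the abstract describes can be illustrated with a short sketch. The Python example below (not the authors' code) trains one probability-calibrated SVM per modality and combines their outputs with Dempster's rule of combination, which the "DS" in SVM-DS presumably denotes (Dempster-Shafer evidence theory). The synthetic data, the per-modality feature split, and all names below are assumptions for illustration only.

```python
# A minimal sketch, assuming scikit-learn and synthetic data; this is NOT the
# authors' implementation. It shows decision-level fusion of two SVMs (one per
# modality) via Dempster's rule of combination. All names, feature splits, and
# data shapes below are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical dataset: the paper selects 17 indicators across both modalities;
# here we assume 9 ECG-derived and 8 driving-behavior features.
n = 400
X_ecg = rng.normal(size=(n, 9))    # e.g., heart-rate-variability statistics (assumed)
X_beh = rng.normal(size=(n, 8))    # e.g., steering/pedal statistics (assumed)
y = rng.integers(0, 2, size=n)     # 1 = angry, 0 = calm

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.25, random_state=0)

def fit_modality_svm(X, y):
    """Fit a standardized RBF SVM that outputs class probabilities."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    return clf.fit(X, y)

clf_ecg = fit_modality_svm(X_ecg[idx_tr], y[idx_tr])
clf_beh = fit_modality_svm(X_beh[idx_tr], y[idx_tr])

def ds_combine(p1, p2):
    """Dempster's rule for two binary mass assignments on {calm, angry}.

    With mass only on the two singleton classes, the combined mass is the
    elementwise product of agreeing masses, renormalized by 1 - K, where K
    is the conflict (probability the two sources pick different classes).
    """
    agree = p1 * p2                           # joint mass on matching classes
    denom = agree.sum(axis=1, keepdims=True)  # = 1 - K
    return agree / np.maximum(denom, 1e-12)   # guard against total conflict

p_fused = ds_combine(clf_ecg.predict_proba(X_ecg[idx_te]),
                     clf_beh.predict_proba(X_beh[idx_te]))
y_pred = p_fused.argmax(axis=1)
print("fused accuracy on synthetic data:", (y_pred == y[idx_te]).mean())
```

Feature-level fusion, the baseline the abstract compares against, would instead concatenate the two feature blocks (e.g., np.hstack([X_ecg, X_beh])) and train a single SVM; the reported 0.8% gap suggests decision-level fusion edges it out on this task.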
Details
- Language: English
- ISSN: 1538-9588
- Volume: 25
- Issue: 3
- Database: Complementary Index
- Journal: Traffic Injury Prevention
- Publication Type: Academic Journal
- Accession Number: 176211388
- Full Text: https://doi.org/10.1080/15389588.2023.2297658