
Emotion classification with multi‐modal physiological signals using multi‐attention‐based neural network

Authors :
Chengsheng Zou
Zhen Deng
Bingwei He
Maosong Yan
Jie Wu
Zhaoju Zhu
Source :
Cognitive Computation and Systems, Vol 6, Iss 1-3, Pp 1-11 (2024)
Publication Year :
2024
Publisher :
Wiley, 2024.

Abstract

The ability to classify human emotional states effectively is critically important for human-computer and human-robot interaction. However, emotion classification with physiological signals remains a challenging problem due to the diversity of emotion expression and the characteristic differences among signal modalities. A novel learning-based network architecture is presented that exploits four modalities of physiological signals, electrocardiogram, electrodermal activity, electromyography, and blood volume pulse, to classify emotion states. It features two kinds of attention modules, feature-level and semantic-level, which drive the network to focus on information-rich features by mimicking the human attention mechanism. The feature-level attention module encodes the rich information within each physiological signal, while the semantic-level attention module captures the semantic dependencies among modalities. The performance of the designed network is evaluated on the open-source Wearable Stress and Affect Detection dataset, where the developed emotion classification system achieves an accuracy of 83.88%. The results demonstrate that the proposed network can effectively process four-modal physiological signals and achieve high emotion classification accuracy.
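The record does not specify the paper's exact layer configuration, so the following is a minimal PyTorch sketch of the two-stage attention idea the abstract describes: a feature-level attention module re-weighting the encoded features of each of the four signal modalities, followed by a semantic-level attention module modeling dependencies across modalities before classification. All dimensions, layer choices, and names here are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn


class FeatureAttention(nn.Module):
    """Feature-level attention: learns a softmax weight per feature
    dimension of one modality's embedding (illustrative design)."""
    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, dim)
        weights = torch.softmax(self.score(x), dim=-1)
        return x * weights                               # re-weighted features


class SemanticAttention(nn.Module):
    """Semantic-level attention: self-attention across the stacked
    modality embeddings to capture cross-modal dependencies."""
    def __init__(self, dim: int, heads: int = 2):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, m: torch.Tensor) -> torch.Tensor:  # m: (batch, n_modals, dim)
        out, _ = self.attn(m, m, m)
        return out.mean(dim=1)                           # fuse modalities


class MultiAttentionClassifier(nn.Module):
    """Assumed end-to-end pipeline: per-modality encoder + feature-level
    attention, then semantic-level attention fusion, then a linear head."""
    def __init__(self, in_dims, dim: int = 64, n_classes: int = 3):
        super().__init__()
        self.encoders = nn.ModuleList(nn.Linear(d, dim) for d in in_dims)
        self.feat_attn = nn.ModuleList(FeatureAttention(dim) for _ in in_dims)
        self.sem_attn = SemanticAttention(dim)
        self.head = nn.Linear(dim, n_classes)

    def forward(self, signals):                          # list of (batch, in_dim) tensors
        feats = [a(e(x)) for e, a, x in zip(self.encoders, self.feat_attn, signals)]
        fused = self.sem_attn(torch.stack(feats, dim=1))
        return self.head(fused)


# Four modalities (ECG, EDA, EMG, BVP) with illustrative feature sizes.
dims = [32, 16, 24, 20]
model = MultiAttentionClassifier(in_dims=dims)
batch = [torch.randn(8, d) for d in dims]
logits = model(batch)                                    # shape (8, 3)
```

A real implementation would replace the linear encoders with modality-specific signal encoders (e.g. 1-D convolutions over raw waveforms) and train against the WESAD labels; the sketch only shows how the two attention levels compose.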

Details

Language :
English
ISSN :
2517-7567
Volume :
6
Issue :
1-3
Database :
Directory of Open Access Journals
Journal :
Cognitive Computation and Systems
Publication Type :
Academic Journal
Accession number :
edsdoj.528d715e4a3d445eb64efcbfd0295842
Document Type :
Article
Full Text :
https://doi.org/10.1049/ccs2.12107