1. Multi-view Time-frequency Contrastive Learning for Emotion Recognition
- Authors
Wang, Lei; Zhu, Jianping; Jin, Bo; Wei, XiaoPeng
- Subjects
Cognitive Neuroscience, Computer Science, Emotion Perception, Electroencephalography (EEG), Neural Networks
- Abstract
Electroencephalogram (EEG) signals are physiological indicators of brain activity; their high temporal resolution captures subtle emotional changes and provides rich information for emotion recognition. However, extracting effective features from EEG data with a low signal-to-noise ratio remains a significant challenge that hinders progress in this field. To address this issue, we propose a multi-view time-frequency contrastive learning framework, MV-TFCL, to enhance the representation capability of EEG signals from multiple perspectives. First, we introduce a recursive neural network based on multi-scale time-frequency consistency, which integrates global semantic information across scales through gated units; to our knowledge, this is the first application of multi-scale time-frequency consistency to emotion recognition. Second, we design a tree-structured time-frequency encoder to capture local semantic information within the time-frequency domain. Finally, we impose semantic consistency constraints from both global and local perspectives to learn more generalizable and robust features. Extensive experiments on two publicly available datasets demonstrate the effectiveness and superiority of the proposed method (an illustrative sketch of the time-frequency contrastive idea follows this entry).
- Published
2024
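
The following is a minimal sketch of the general time-frequency contrastive consistency idea that the abstract describes, not the authors' MV-TFCL implementation: the encoder architectures, the FFT-based frequency view, and the names `TimeEncoder`, `FreqEncoder`, and `nt_xent` are all illustrative assumptions. It aligns a time-domain embedding and a frequency-domain embedding of the same EEG window with an NT-Xent objective, treating matching views as positives and other trials in the batch as negatives.

```python
# Illustrative sketch only: a generic time-frequency contrastive consistency
# objective for EEG-like signals. Module and function names (TimeEncoder,
# FreqEncoder, nt_xent) are hypothetical and do NOT reproduce MV-TFCL.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeEncoder(nn.Module):
    """Maps a raw multi-channel EEG window (B, C, T) to an embedding."""
    def __init__(self, channels: int, emb_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(64, emb_dim),
        )

    def forward(self, x):
        return self.net(x)


class FreqEncoder(nn.Module):
    """Maps a magnitude spectrum (B, C, F) to an embedding of the same size."""
    def __init__(self, channels: int, emb_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(64, emb_dim),
        )

    def forward(self, x):
        return self.net(x)


def nt_xent(z_a, z_b, temperature: float = 0.1):
    """NT-Xent loss: matching time/frequency views of the same trial are
    positives; all other pairs in the batch serve as negatives."""
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    logits = z_a @ z_b.t() / temperature          # (B, B) similarity matrix
    labels = torch.arange(z_a.size(0), device=z_a.device)
    return 0.5 * (F.cross_entropy(logits, labels) +
                  F.cross_entropy(logits.t(), labels))


if __name__ == "__main__":
    batch, channels, samples = 32, 62, 256        # toy EEG-shaped batch
    eeg = torch.randn(batch, channels, samples)

    # Frequency view: magnitude of the real FFT along the time axis.
    spec = torch.fft.rfft(eeg, dim=-1).abs()

    f_time = TimeEncoder(channels)(eeg)
    f_freq = FreqEncoder(channels)(spec)
    loss = nt_xent(f_time, f_freq)
    print(f"time-frequency consistency loss: {loss.item():.4f}")
```

In this reading, the consistency constraint simply pulls the time-view and frequency-view embeddings of the same trial together; the paper's multi-scale gated fusion and tree-structured encoder would replace the toy convolutional encoders shown here.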