
Curriculum Learning Meets Directed Acyclic Graph for Multimodal Emotion Recognition

Authors:
Nguyen, Cam-Van Thi
Nguyen, Cao-Bach
Ha, Quang-Thuy
Le, Duc-Trong
Publication Year:
2024

Abstract

Emotion recognition in conversation (ERC) is a crucial task in natural language processing and affective computing. This paper proposes MultiDAG+CL, a novel approach for multimodal ERC that employs a Directed Acyclic Graph (DAG) to integrate textual, acoustic, and visual features within a unified framework. The model is enhanced with Curriculum Learning (CL) to address challenges related to emotional shifts and data imbalance. Curriculum learning facilitates the learning process by gradually presenting training samples in a meaningful order, thereby improving the model's performance in handling emotional variations and data imbalance. Experimental results on the IEMOCAP and MELD datasets demonstrate that the MultiDAG+CL models outperform baseline models. We release the code for MultiDAG+CL and our experiments at https://github.com/vanntc711/MultiDAG-CL

Comment: Accepted by LREC-COLING 2024
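The curriculum component described in the abstract presents training samples in an easy-to-hard order. Below is a minimal sketch of such a difficulty-based ordering; the heuristic (emotional shift plus class rarity), the function names, and the toy labels are illustrative assumptions, not the authors' implementation, which is available at the GitHub link above.

```python
# Minimal sketch of a difficulty-based curriculum schedule.
# Assumption: a turn is "hard" if its emotion label differs from the
# previous turn (an emotional shift) and/or its class is rare.
# This is an illustration, not the MultiDAG+CL code.
from collections import Counter
from typing import List, Tuple


def difficulty_scores(labels: List[int], prev_labels: List[int]) -> List[float]:
    """Higher score = harder sample (emotional shift and/or rare class)."""
    freq = Counter(labels)
    total = len(labels)
    scores = []
    for lab, prev in zip(labels, prev_labels):
        shift = 1.0 if lab != prev else 0.0   # emotional shift w.r.t. previous turn
        rarity = 1.0 - freq[lab] / total      # rarer classes score higher
        scores.append(shift + rarity)
    return scores


def curriculum_order(samples: List[int], scores: List[float]) -> List[int]:
    """Return samples sorted easy-to-hard, the order they are fed to training."""
    return [s for _, s in sorted(zip(scores, samples), key=lambda p: p[0])]


if __name__ == "__main__":
    labels = [0, 0, 1, 2, 1, 1]        # emotion class per utterance (toy data)
    prev_labels = [0, 0, 0, 1, 2, 1]   # label of the preceding utterance
    samples = list(range(len(labels))) # stand-ins for utterance features
    order = curriculum_order(samples, difficulty_scores(labels, prev_labels))
    print(order)  # easier samples (no shift, frequent class) come first
```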

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2402.17269
Document Type:
Working Paper