
Shared-Private Memory Networks For Multimodal Sentiment Analysis.

Authors :
Zhao, Xianbing
Chen, Yinxin
Liu, Sicen
Tang, Buzhou
Source :
IEEE Transactions on Affective Computing; Oct-Dec 2023, Vol. 14 Issue 4, p2889-2900, 12p
Publication Year :
2023

Abstract

Text, visual, and acoustic modalities are usually complementary in the Multimodal Sentiment Analysis (MSA) task. However, current methods primarily focus on shared representations while neglecting the critical private aspects of the data within individual modalities. In this work, we propose shared-private memory networks, called SPMN, based on recent advances in the attention mechanism, to decouple multimodal representations into shared and private components. SPMN contains three components: a) a shared memory that learns the shared representations of multimodal data; b) three private memories that learn the private representations of the individual modalities, respectively; and c) adaptive fusion gates that fuse the multimodal private and shared representations. To evaluate the effectiveness of SPMN, we integrate it into different pre-trained language representation models, such as BERT and XLNET, and conduct experiments on two public datasets, CMU-MOSI and CMU-MOSEI. Experimental results indicate that SPMN significantly improves the performance of pre-trained language representation models and demonstrate the superiority of our model over state-of-the-art methods. SPMN's source code is publicly available at: https://github.com/xiaobaicaihhh/SPMN. [ABSTRACT FROM AUTHOR]
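The adaptive fusion gate described in component c) can be illustrated with a minimal numerical sketch. This is an assumption-laden illustration, not the authors' exact formulation: it assumes a sigmoid gate computed from the concatenated shared and private vectors, which then takes a per-dimension convex combination of the two. All names (`fusion_gate`, `W`, `b`) and the hidden size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # illustrative hidden size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fusion_gate(shared, private, W, b):
    # Gate values in (0, 1), computed from both representations.
    g = sigmoid(np.concatenate([shared, private]) @ W + b)
    # Per-dimension convex combination: the gate decides how much
    # shared vs. private information to keep in each dimension.
    return g * shared + (1.0 - g) * private

shared = rng.standard_normal(d)    # stand-in for a shared-memory output
private = rng.standard_normal(d)   # stand-in for a private-memory output
W = rng.standard_normal((2 * d, d)) * 0.1
b = np.zeros(d)

fused = fusion_gate(shared, private, W, b)
print(fused.shape)
```

Because the gate produces a convex combination, each fused dimension stays between the corresponding shared and private values; in the paper this gating is learned end-to-end alongside the memories.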

Details

Language :
English
ISSN :
1949-3045
Volume :
14
Issue :
4
Database :
Complementary Index
Journal :
IEEE Transactions on Affective Computing
Publication Type :
Academic Journal
Accession number :
173946083
Full Text :
https://doi.org/10.1109/TAFFC.2022.3222023