Various syncretic co‐attention network for multimodal sentiment analysis.
- Authors
- Cao, Meng; Zhu, Yonghua; Gao, Wenjing; Li, Mengyao; Wang, Shaoxiu
- Subjects
- SENTIMENT analysis; COINTEGRATION; ATTITUDE (Psychology); SOCIAL networks; SEMANTICS
- Abstract
Summary: Multimedia content shared on social networks reveals the public's sentimental attitudes toward specific events, so it is necessary to conduct sentiment analysis automatically on the abundant multimedia data posted by the public for real‐world applications. However, single‐modal approaches to sentiment analysis neglect the internal connections between textual and visual content, and current multimodal methods fail to exploit the multilevel semantic relations of heterogeneous features. In this article, the various syncretic co‐attention network (VSCN) is proposed to excavate the intricate multilevel correspondences between multimodal data and to combine the unique information of each modality for integrated, complementary sentiment classification. Specifically, a multilevel co‐attention module is constructed to explore localized correspondences between each image region and each text word, as well as holistic correspondences between global visual information and context‐based textual semantics. All single‐modal features can then be fused at their respective levels. Beyond the fused multimodal features, the proposed VSCN also considers the unique information of each modality and integrates everything into an end‐to‐end framework for sentiment analysis. The superior results of experiments on three constructed real‐world datasets and on the benchmark Visual Sentiment Ontology (VSO) dataset demonstrate the effectiveness of VSCN. In particular, qualitative analyses are provided to explain the method in depth. [ABSTRACT FROM AUTHOR]
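The localized word–region correspondence described in the abstract can be illustrated with a generic dot-product co-attention sketch, in which each word attends over image regions and each region attends over words. This is a minimal illustration of the general technique, not the authors' actual VSCN architecture; the feature dimensions and the scaled dot-product affinity are assumptions for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(text_feats, img_feats):
    """Generic word-region co-attention (illustrative, not the paper's exact module).

    text_feats: (n_words, d) word embeddings
    img_feats:  (n_regions, d) image-region embeddings
    """
    d = text_feats.shape[1]
    # Affinity between every word and every image region.
    affinity = text_feats @ img_feats.T / np.sqrt(d)          # (n_words, n_regions)
    # Text-guided visual features: regions weighted per word.
    img_attended = softmax(affinity, axis=1) @ img_feats      # (n_words, d)
    # Image-guided textual features: words weighted per region.
    text_attended = softmax(affinity.T, axis=1) @ text_feats  # (n_regions, d)
    return img_attended, text_attended

# Toy inputs standing in for encoder outputs (hypothetical sizes).
rng = np.random.default_rng(0)
words = rng.standard_normal((6, 32))    # 6 words, 32-dim features
regions = rng.standard_normal((4, 32))  # 4 image regions, 32-dim features
img_attended, text_attended = co_attention(words, regions)
print(img_attended.shape, text_attended.shape)  # (6, 32) (4, 32)
```

The attended features of each modality could then be concatenated or fused with the original single-modal features before classification, in the spirit of the multilevel fusion the abstract describes.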
- Published
- 2020