1. A multi-feature fusion approach based on domain adaptive pretraining for aspect-based sentiment analysis.
- Author
- Ma, Yinglong, He, Ming, Pang, Yunhe, Wang, Libiao, and Liu, Huili
- Subjects
- *SENTIMENT analysis, *LANGUAGE models, *PARSING (Computer grammar), *DEEP learning, *MACHINE learning, *SEMANTICS
- Abstract
Aspect-based sentiment analysis aims to recognize the sentiment polarities of opinion words with the aid of machine learning or deep learning-based sentiment classification models. Dependency parsing has been considered an efficient tool for identifying opinion words in sentiment text. However, many dependency-based methods are susceptible to errors in the dependency tree, which inevitably introduces noisy information because the rich relational information between words is neglected. In this paper, we propose a multi-feature fusion approach based on domain adaptive pretraining to reduce dependency-based noisy information for aspect-based sentiment classification (ASC). First, we utilize multi-task learning (MTL) for domain adaptive pretraining, which combines a biaffine attention model (BAM) and a masked language model (MLM) by jointly considering the structure, the relation semantics of edges, and the linguistic features of the sentiment text. Second, to fully account for the interactions among these different features, a double graph fusion model is proposed, which feeds the pretrained dependency graph into a message passing neural network (MPNN) initialized with the optimal parameters of the pretrained BAM for training. Lastly, extensive experiments were conducted against state-of-the-art competitors on four benchmark datasets, and the results show that our approach outperforms these competitors on most of the four datasets, achieving an accuracy of up to 92.69% and a macro-averaged F1 of up to 85.79%. The MTL-based domain adaptive pretraining efficiently achieves high-quality dependency parsing, which contributes to improving the performance of ASC while maintaining lower computational cost. [ABSTRACT FROM AUTHOR]
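The abstract gives no implementation details, but the two pretraining objectives it names, biaffine arc scoring and masked language modeling, have standard formulations, as does a message-passing layer over the resulting dependency graph. The PyTorch sketch below is only an illustration of how such components might fit together; the class names, the mixing weight `alpha`, and all tensor shapes are assumptions rather than the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiaffineAttention(nn.Module):
    """Biaffine arc scorer in the Dozat-and-Manning style:
    s(i, j) = h_i^T U h_j + w^T [h_i; h_j] + b."""
    def __init__(self, dim: int):
        super().__init__()
        self.U = nn.Parameter(torch.empty(dim, dim))
        self.w = nn.Linear(2 * dim, 1)  # linear term over the concatenated pair
        nn.init.xavier_uniform_(self.U)

    def forward(self, heads: torch.Tensor, deps: torch.Tensor) -> torch.Tensor:
        # heads, deps: (batch, seq, dim) candidate head / dependent states
        bilinear = torch.einsum("bid,de,bje->bij", heads, self.U, deps)
        n = heads.size(1)
        pairs = torch.cat(
            [heads.unsqueeze(2).expand(-1, -1, n, -1),
             deps.unsqueeze(1).expand(-1, n, -1, -1)], dim=-1)
        return bilinear + self.w(pairs).squeeze(-1)  # (batch, seq, seq) arc scores


def multitask_pretrain_loss(arc_scores, gold_heads, mlm_logits, mlm_labels,
                            alpha: float = 0.5):
    """Joint pretraining objective: arc prediction plus masked-token prediction.
    `alpha` is a hypothetical mixing weight; the abstract does not state how
    the two losses are weighted."""
    arc_loss = F.cross_entropy(arc_scores.flatten(0, 1), gold_heads.flatten())
    mlm_loss = F.cross_entropy(mlm_logits.flatten(0, 1), mlm_labels.flatten(),
                               ignore_index=-100)  # -100 marks unmasked tokens
    return alpha * arc_loss + (1 - alpha) * mlm_loss


class MPNNLayer(nn.Module):
    """One message-passing round over a (soft) dependency graph, e.g. edge
    weights obtained by softmaxing pretrained biaffine arc scores."""
    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(dim, dim)
        self.upd = nn.GRUCell(dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq, dim) node states; adj: (batch, seq, seq) edge weights
        m = torch.bmm(adj, self.msg(h))  # aggregate weighted neighbor messages
        b, n, d = h.shape
        return self.upd(m.reshape(b * n, d), h.reshape(b * n, d)).reshape(b, n, d)
```

Under this reading, initializing the MPNN "with the optimal parameters of the pretrained BAM" would amount to reusing the pretrained encoder and arc scorer to produce the edge weights `adj` before fine-tuning on the ASC task.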
- Published
- 2024