
Fine-Grained Sentiment-Controlled Text Generation Approach Based on Pre-Trained Language Model

Authors :
Linan Zhu
Yifei Xu
Zhechao Zhu
Yinwei Bao
Xiangjie Kong
Source :
Applied Sciences, Vol 13, Iss 1, p 264 (2022)
Publication Year :
2022
Publisher :
MDPI AG, 2022.

Abstract

Sentiment-controlled text generation aims to generate texts that express a given sentiment. However, most existing studies focus only on document- or sentence-level sentiment control, leaving a gap in finer-grained control over the content of the generated results. Fine-grained control allows a generated review to express different opinions toward multiple aspects. Some previous works attempted to generate reviews conditioned on aspect-level sentiments, but they usually suffer from low adaptability and the lack of an annotated dataset. To alleviate these problems, we propose a novel pre-trained extended generative model that can dynamically refer to the prompt sentiment, together with an auxiliary classifier that extracts fine-grained sentiments from unannotated sentences, which allows training on both annotated and unannotated datasets. We also propose a query-hint mechanism that further guides the generation process toward the aspect-level sentiments at every time step. Experimental results on real-world datasets demonstrate that our model generates aspect-level sentiment-controllable review texts with high sentiment coverage and stable quality, steadily outperforming the baseline models on both datasets in terms of BLEU-4, METEOR, and ROUGE-L. A limitation of this work is that we focus only on fine-grained sentiments that are explicitly expressed; generation controlled by implicitly expressed fine-grained sentiments is left as an open problem for future work.
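
The record does not include the paper's implementation, so the snippet below is only a rough illustration of the general idea of aspect-level sentiment conditioning with a pre-trained language model: hypothetical aspect-sentiment control tokens are prepended to the prompt of a public GPT-2 checkpoint via the Hugging Face transformers library. The paper's extended generative model, auxiliary classifier, and query-hint mechanism are not reproduced here.

```python
# Illustrative sketch only: not the authors' architecture. It approximates
# aspect-level sentiment control with a plain prompt prefix on a public
# pre-trained language model ("gpt2" from Hugging Face transformers).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical fine-grained control input: one sentiment label per aspect.
aspect_sentiments = {"food": "positive", "service": "negative"}

# Encode the aspect-level sentiments as a textual prefix. In the paper this
# conditioning is learned; here it is only a prompt-based approximation.
prefix = " ".join(f"<{aspect}:{polarity}>" for aspect, polarity in aspect_sentiments.items())
prompt = prefix + " Review:"

input_ids = tokenizer.encode(prompt, return_tensors="pt")
output_ids = model.generate(
    input_ids,
    max_length=80,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```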

Details

Language :
English
ISSN :
2076-3417
Volume :
13
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Applied Sciences
Publication Type :
Academic Journal
Accession number :
edsdoj.1367ba1d2cd42da824419c1a0788f33
Document Type :
article
Full Text :
https://doi.org/10.3390/app13010264