
Review of neural approaches for conditional text generation

Authors :
A. A. Marchenko
O. H. Skurzhanskyi
Source :
Bulletin of Taras Shevchenko National University of Kyiv. Series: Physics and Mathematics, pp. 102-107
Publication Year :
2021
Publisher :
Taras Shevchenko National University of Kyiv, 2021.

Abstract

The article is devoted to a review of conditional text generation, one of the most promising fields of natural language processing and artificial intelligence. Specifically, we explore monolingual local sequence transduction tasks: paraphrase generation, grammatical and spelling error correction, and text simplification. To give a better understanding of the considered tasks, we show examples of good rewrites. We then take a close look at key aspects such as publicly available datasets with their splits (training, validation, and testing), quality metrics for proper evaluation, and modern solutions based primarily on neural networks. For each task, we analyze its main characteristics and how they influence the state-of-the-art models. Finally, we investigate the most significant features shared across the group of tasks as a whole and across the approaches that solve them.

Pages of the article in the issue: 102-107
Language of the article: Ukrainian
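The tasks listed in the abstract are all monolingual local sequence transductions: the output stays close to the input and most edits are local. A minimal illustrative sketch in Python is shown below; the sentence pairs are our own hypothetical examples, not taken from the article or its datasets.

# Hypothetical input/output pairs illustrating the three surveyed tasks;
# these examples are ours and are not drawn from the paper's datasets.
examples = {
    "paraphrase generation": (
        "How can I improve my writing skills?",
        "What can I do to write better?",
    ),
    "grammatical and spelling error correction": (
        "She go to the libary every days.",
        "She goes to the library every day.",
    ),
    "text simplification": (
        "The committee reached a consensus subsequent to protracted deliberations.",
        "The committee agreed after long discussions.",
    ),
}

# Print each task with its source sentence and rewritten output.
for task, (source, rewrite) in examples.items():
    print(f"{task}:\n  input : {source}\n  output: {rewrite}\n")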

Details

ISSN :
2218-2055 and 1812-5409
Database :
OpenAIRE
Journal :
Bulletin of Taras Shevchenko National University of Kyiv. Series: Physics and Mathematics
Accession number :
edsair.doi...........de5e6ff6f5323fa7b97c94a3be8fb16c
Full Text :
https://doi.org/10.17721/1812-5409.2021/1.13