Towards Holistic Human Evaluation of Automatic Text Simplification

Authors :
Carrer, Luisa
Säuberli, Andreas; https://orcid.org/0000-0001-9613-334X
Kappus, Martin; https://orcid.org/0000-0002-9543-579X
Ebling, Sarah; https://orcid.org/0000-0001-6511-5085
Source :
Carrer, Luisa; Säuberli, Andreas; Kappus, Martin; Ebling, Sarah (2024). Towards Holistic Human Evaluation of Automatic Text Simplification. In: Proceedings of the Fourth Workshop on Human Evaluation of NLP Systems (HumEval) @ LREC-COLING 2024, Turin, Italy, 21 May 2024. ELRA and ICCL, 71-80.
Publication Year :
2024

Abstract

Text simplification refers to the process of rewording within a single language, moving from a standard form into an easy-to-understand one. Easy Language and Plain Language are two examples of simplified varieties aimed at improving readability and understanding for a wide-ranging audience. Human evaluation of automatic text simplification is usually done by employing experts or crowdworkers to rate the generated texts. However, this approach does not include the target readers of simplified texts and does not reflect actual comprehensibility. In this paper, we explore different ways of measuring the quality of automatically simplified texts. We conducted a multi-faceted evaluation study involving end users, post-editors, and Easy Language experts and applied a variety of qualitative and quantitative methods. We found differences in the perception and actual comprehension of the texts by different user groups. In addition, qualitative surveys and behavioral observations proved to be essential in interpreting the results.

Details

Database :
OAIster
Journal :
Proceedings of the Fourth Workshop on Human Evaluation of NLP Systems (HumEval) @ LREC-COLING 2024
Notes :
application/pdf, info:doi/10.5167/uzh-259801, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1443058651
Document Type :
Electronic Resource