
Entity-level Factual Consistency of Abstractive Text Summarization

Authors :
Nan, Feng
Nallapati, Ramesh
Wang, Zhiguo
Santos, Cicero Nogueira dos
Zhu, Henghui
Zhang, Dejiao
McKeown, Kathleen
Xiang, Bing
Publication Year :
2021

Abstract

A key challenge for abstractive summarization is ensuring factual consistency of the generated summary with respect to the original document. For example, state-of-the-art models trained on existing datasets exhibit entity hallucination, generating names of entities that are not present in the source document. We propose a set of new metrics to quantify the entity-level factual consistency of generated summaries and show that the entity hallucination problem can be alleviated by simply filtering the training data. In addition, we propose adding a summary-worthy entity classification task to the training process, as well as a joint entity and summary generation approach, both of which yield further improvements in entity-level metrics.

Comment: EACL 2021
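The abstract does not give the exact metric definitions, but the core idea of measuring entity hallucination can be illustrated with a minimal sketch: the fraction of named entities in a generated summary that also appear in the source document. The snippet below assumes spaCy with the en_core_web_sm model; the function name and matching-by-surface-form heuristic are illustrative, not the paper's implementation.

```python
# Minimal sketch (not the paper's implementation): share of named entities
# in a generated summary that also appear in the source document.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes this NER model is installed

def entity_precision_wrt_source(summary: str, source: str) -> float:
    """Fraction of summary entities whose surface form occurs in the source.

    A low value suggests entity hallucination: the summary names entities
    that cannot be found in the source document.
    """
    summary_entities = [ent.text for ent in nlp(summary).ents]
    if not summary_entities:
        return 1.0  # no entities in the summary, nothing to hallucinate
    source_text = source.lower()
    matched = sum(1 for ent in summary_entities if ent.lower() in source_text)
    return matched / len(summary_entities)

if __name__ == "__main__":
    src = "Apple acquired the startup Shazam in 2018 for a reported $400 million."
    faithful = "Apple bought Shazam in 2018."
    hallucinated = "Google bought Shazam in 2018."
    # The score should be lower for the hallucinated summary, since
    # "Google" never appears in the source document.
    print(entity_precision_wrt_source(faithful, src))
    print(entity_precision_wrt_source(hallucinated, src))
```

Exact string matching of entity mentions is a simplification; a fuller treatment would normalize entity spans and handle partial matches, and the paper additionally considers precision and recall against the reference summary.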

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2102.09130
Document Type :
Working Paper