
Improving Factuality of Abstractive Summarization without Sacrificing Summary Quality

Authors:
Dixit, Tanay
Wang, Fei
Chen, Muhao
Publication Year:
2023

Abstract

Improving factual consistency of abstractive summarization has been a widely studied topic. However, most of the prior works on training factuality-aware models have ignored the negative effect this training has on summary quality. We propose EFACTSUM (i.e., Effective Factual Summarization), a candidate summary generation and ranking technique to improve summary factuality without sacrificing summary quality. We show that using a contrastive learning framework with our refined candidate summaries leads to significant gains on both factuality and similarity-based metrics. Specifically, we propose a ranking strategy in which we effectively combine two metrics, thereby preventing any conflict during training. Models trained using our approach show up to 6 points of absolute improvement over the base model with respect to FactCC on XSUM and 11 points on CNN/DM, without negatively affecting either similarity-based metrics or abstractiveness.

Comment: ACL 2023
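The abstract does not spell out how the two metrics are combined when ranking candidate summaries. As a purely illustrative sketch, not the authors' actual EFACTSUM procedure, the Python snippet below ranks candidates by averaging their ranks under a factuality scorer (e.g., a FactCC-style model) and a similarity scorer (e.g., ROUGE against the reference); both scoring functions and the rank-averaging rule are hypothetical placeholders.

```python
# Illustrative sketch only: rank candidate summaries by combining a
# factuality score and a similarity score via average rank. The scorers
# and the combination rule are assumptions, not the EFACTSUM method.
from typing import Callable, List, Tuple


def rank_candidates(
    candidates: List[str],
    factuality_score: Callable[[str], float],  # hypothetical FactCC-style scorer
    similarity_score: Callable[[str], float],  # hypothetical ROUGE-style scorer
) -> List[Tuple[str, float]]:
    """Return candidates sorted by combined (average) rank, best first."""
    fact_scores = [factuality_score(c) for c in candidates]
    sim_scores = [similarity_score(c) for c in candidates]

    def to_ranks(scores: List[float]) -> List[int]:
        # Higher score -> better (smaller) rank.
        order = sorted(range(len(scores)), key=lambda i: -scores[i])
        ranks = [0] * len(scores)
        for rank, idx in enumerate(order):
            ranks[idx] = rank
        return ranks

    fact_ranks = to_ranks(fact_scores)
    sim_ranks = to_ranks(sim_scores)
    combined = [
        (cand, (fr + sr) / 2)
        for cand, fr, sr in zip(candidates, fact_ranks, sim_ranks)
    ]
    return sorted(combined, key=lambda item: item[1])


if __name__ == "__main__":
    # Toy usage with dummy scorers standing in for FactCC and ROUGE.
    cands = ["candidate summary a", "candidate summary b", "candidate summary c"]
    for cand, score in rank_candidates(cands, lambda s: len(s) % 3, lambda s: len(set(s))):
        print(f"{score:.1f}\t{cand}")
```

Ordering candidates by a single combined rank, rather than by either metric alone, is one simple way to keep the factuality and similarity signals from conflicting during contrastive training; the paper should be consulted for the actual formulation.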

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2305.14981
Document Type:
Working Paper