
Follow the Wisdom of the Crowd: Effective Text Generation via Minimum Bayes Risk Decoding

Authors :
Suzgun, Mirac
Melas-Kyriazi, Luke
Jurafsky, Dan
Publication Year :
2022

Abstract

In open-ended natural-language generation, existing text decoding methods typically struggle to produce text which is both diverse and high-quality. Greedy and beam search are known to suffer from text degeneration and linguistic diversity issues, while temperature, top-k, and nucleus sampling often yield diverse but low-quality outputs. In this work, we present crowd sampling, a family of decoding methods based on Bayesian risk minimization, to address this diversity-quality trade-off. Inspired by the principle of "the wisdom of the crowd," crowd sampling seeks to select a candidate from a pool of candidates that has the least expected risk (i.e., highest expected reward) under a generative model according to a given utility function. Crowd sampling can be seen as a generalization of numerous existing methods, including majority voting, and in practice, it can be used as a drop-in replacement for existing sampling methods. Extensive experiments show that crowd sampling delivers improvements of 3-7 ROUGE and BLEU points across a wide range of tasks, including summarization, data-to-text, translation, and textual style transfer, while achieving new state-of-the-art results on WebNLG and WMT'16.

Comment : https://github.com/suzgunmirac/crowd-sampling
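The selection rule described in the abstract, picking the candidate with the highest expected utility against the rest of the sampled pool, can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' released implementation: it uses a toy token-overlap F1 as a stand-in utility for metrics such as ROUGE or BLEU, and names like `mbr_select` are hypothetical.

```python
# Minimal sketch of Minimum Bayes Risk (MBR) / crowd-sampling selection.
# The utility here is a toy token-overlap F1 standing in for ROUGE/BLEU;
# function names are illustrative, not from the authors' codebase.

from collections import Counter
from typing import Callable, List


def overlap_f1(hypothesis: str, reference: str) -> float:
    """Token-overlap F1 between two strings (a toy utility function)."""
    hyp, ref = hypothesis.split(), reference.split()
    if not hyp or not ref:
        return 0.0
    common = sum((Counter(hyp) & Counter(ref)).values())
    if common == 0:
        return 0.0
    precision = common / len(hyp)
    recall = common / len(ref)
    return 2 * precision * recall / (precision + recall)


def mbr_select(candidates: List[str],
               utility: Callable[[str, str], float] = overlap_f1) -> str:
    """Return the candidate with the highest expected utility, where the
    expectation is approximated by averaging each candidate's utility
    against all other sampled candidates (i.e., minimum expected risk)."""
    best, best_score = candidates[0], float("-inf")
    for i, hyp in enumerate(candidates):
        total = sum(utility(hyp, ref)
                    for j, ref in enumerate(candidates) if j != i)
        score = total / max(len(candidates) - 1, 1)
        if score > best_score:
            best, best_score = hyp, score
    return best


if __name__ == "__main__":
    # Candidates would normally come from temperature / nucleus sampling.
    pool = [
        "the cat sat on the mat",
        "a cat is sitting on the mat",
        "the dog barked loudly",
    ]
    print(mbr_select(pool))  # picks the candidate closest to the "crowd"
```

Swapping in a stronger utility (e.g., an actual BLEU or ROUGE scorer) and a larger sampled pool recovers the drop-in behavior the abstract describes.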

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2211.07634
Document Type :
Working Paper