
Principled Gradient-based Markov Chain Monte Carlo for Text Generation

Authors:
Du, Li
Amini, Afra
Hennigen, Lucas Torroba
Yu, Xinyan Velocity
Eisner, Jason
Lee, Holden
Cotterell, Ryan
Publication Year:
2023

Abstract

Recent papers have demonstrated the possibility of energy-based text generation by adapting gradient-based sampling algorithms, a paradigm of MCMC algorithms that promises fast convergence. However, as we show in this paper, previous attempts at this approach to text generation all fail to sample correctly from the target language model distributions. To address this limitation, we consider the problem of designing text samplers that are faithful, meaning that they have the target text distribution as their limiting distribution. We propose several faithful gradient-based sampling algorithms to sample from the target energy-based text distribution correctly, and study their theoretical properties. Through experiments on various forms of text generation, we demonstrate that faithful samplers are able to generate more fluent text while better adhering to the control objectives.

Comment: Preprint
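The faithfulness property named in the abstract is the standard MCMC correctness criterion: a sampler is faithful if the target distribution is exactly its limiting (stationary) distribution. As a minimal continuous-space illustration of how a gradient-based proposal can be made faithful, the sketch below implements the Metropolis-adjusted Langevin algorithm (MALA). This is not the paper's algorithm (the paper's samplers target text distributions), and the names `mala_sampler`, `energy`, and `grad_energy` are hypothetical; the point is only that the Metropolis-Hastings accept/reject step is what guarantees p(x) ∝ exp(-energy(x)) as the limiting distribution.

```python
import numpy as np

def mala_sampler(energy, grad_energy, x0, step_size=0.05, n_steps=5000, rng=None):
    """Metropolis-adjusted Langevin algorithm (MALA) -- illustrative sketch.

    Draws samples from p(x) proportional to exp(-energy(x)). The
    Metropolis-Hastings correction is what makes the chain "faithful":
    p is its exact limiting distribution.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_steps):
        # Gradient-informed Gaussian proposal (a discretized Langevin step).
        mean_fwd = x - step_size * grad_energy(x)
        y = mean_fwd + np.sqrt(2.0 * step_size) * rng.standard_normal(x.shape)

        # Log proposal densities q(y | x) and q(x | y), up to a shared constant.
        mean_bwd = y - step_size * grad_energy(y)
        log_q_fwd = -np.sum((y - mean_fwd) ** 2) / (4.0 * step_size)
        log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (4.0 * step_size)

        # Metropolis-Hastings acceptance: without this correction the chain's
        # limiting distribution is generally *not* the target.
        log_alpha = (energy(x) - energy(y)) + (log_q_bwd - log_q_fwd)
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples.append(x.copy())
    return np.stack(samples)

# Example: with energy(x) = ||x||^2 / 2, the limiting distribution is N(0, I).
draws = mala_sampler(lambda x: 0.5 * float(x @ x), lambda x: x, x0=np.zeros(2))
```

Roughly, the abstract's claim is that earlier gradient-based text samplers lack a correct analogue of this acceptance step (or otherwise break the conditions that make it valid), and therefore do not sample from the target language model distribution.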

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2312.17710
Document Type:
Working Paper