
Copy That! Editing Sequences by Copying Spans

Authors :
Panthaplackel, Sheena
Allamanis, Miltiadis
Brockschmidt, Marc
Source :
Proceedings of the AAAI Conference on Artificial Intelligence. 35:13622-13630
Publication Year :
2021
Publisher :
Association for the Advancement of Artificial Intelligence (AAAI), 2021.

Abstract

Neural sequence-to-sequence models are finding increasing use in editing documents, for example in correcting a text document or repairing source code. In this paper, we argue that common seq2seq models (with a facility to copy single tokens) are not a natural fit for such tasks, as they have to explicitly copy each unchanged token. We present an extension of seq2seq models capable of copying entire spans of the input to the output in one step, greatly reducing the number of decisions required during inference. This extension means that there are now many ways of generating the same output; we address this by deriving a new objective for training and a variation of beam search for inference that explicitly handle the ambiguity. In our experiments on a range of editing tasks over natural language and source code, we show that our new model consistently outperforms simpler baselines.
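
The marginalization alluded to in the abstract can be illustrated with a short dynamic program. The sketch below is not the authors' released implementation; it assumes, hypothetically, that a trained decoder exposes two scoring functions, log_p_gen(k, tok) and log_p_copy(k, i, j), whose values depend only on the k target tokens emitted so far (the property that makes the recurrence well-defined), and it sums in log space over every interleaving of single-token generations and span copies that produces the target.

import math

def logsumexp(xs):
    # Numerically stable log(sum(exp(x) for x in xs)).
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def marginal_log_likelihood(target, source, log_p_gen, log_p_copy):
    # alpha[k] = log-prob of having produced the first k target tokens,
    # summed over every action sequence (token generations and span
    # copies) that yields exactly that prefix.  The recurrence is valid
    # only because the action scores after k emitted tokens are assumed
    # to depend on those tokens alone, not on how they were produced.
    n = len(target)
    alpha = [-math.inf] * (n + 1)
    alpha[0] = 0.0
    for k in range(1, n + 1):
        # Action 1: generate target[k-1] from the vocabulary.
        terms = [alpha[k - 1] + log_p_gen(k - 1, target[k - 1])]
        # Action 2: copy any source span equal to a suffix of target[:k].
        for length in range(1, k + 1):
            suffix = target[k - length:k]
            for i in range(len(source) - length + 1):
                if source[i:i + length] == suffix:
                    terms.append(alpha[k - length]
                                 + log_p_copy(k - length, i, i + length))
        alpha[k] = logsumexp(terms)
    return alpha[n]

# Toy demo with a flat, made-up score for every action.
source = list("abc")
target = list("abx")
flat = math.log(0.1)
print(marginal_log_likelihood(target, source,
                              lambda k, tok: flat,
                              lambda k, i, j: flat))

In the toy demo, the prefix "ab" is reachable by five distinct action sequences (two generations, generation then single-token copy, single-token copy then generation, two single-token copies, or one two-token span copy), and alpha[2] accumulates all five; a standard per-action loss on one canonical action sequence would credit only one of them, which is why the paper derives a marginalized objective.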

Details

ISSN :
2374-3468 and 2159-5399
Volume :
35
Database :
OpenAIRE
Journal :
Proceedings of the AAAI Conference on Artificial Intelligence
Accession number :
edsair.doi.dedup.....208826f1d422671ed55f9767e82d72fb
Full Text :
https://doi.org/10.1609/aaai.v35i15.17606