
InstructCMP: Length Control in Sentence Compression through Instruction-based Large Language Models

Authors :
Juseon-Do
Kwon, Jingun
Kamigaito, Hidetaka
Okumura, Manabu
Publication Year :
2024

Abstract

Extractive summarization can produce faithful summaries but often requires additional constraints, such as a desired summary length. Traditional sentence compression models typically do not consider such constraints because of their restricted model abilities, which require model modifications to cope with them. To bridge this gap, we propose Instruction-based Compression (InstructCMP), an approach to the sentence compression task that can consider the length constraint through instructions by leveraging the zero-shot task-solving abilities of Large Language Models (LLMs). For this purpose, we created new evaluation datasets by transforming traditional sentence compression datasets into an instruction format. Using these datasets, we first reveal that current LLMs still face challenges in accurately controlling the length of a compressed text. To address this issue, we propose an approach named "length priming," which incorporates additional length information into the instructions without requiring external resources. While length priming works effectively in a zero-shot setting, a training dataset with such instructions can further improve the ability of length control. We therefore additionally created a training dataset in an instruction format and fine-tuned the model on it. Experimental results and analysis show that applying length priming significantly improves the performance of InstructCMP in both zero-shot and fine-tuning settings without the need for any model modifications.

Comment: 8 pages, 3 figures, accepted to ACL 2024 Findings (Long Paper)
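
To make the idea of a length-constrained instruction more concrete, below is a minimal, hypothetical Python sketch of how such a prompt could be assembled, with an optional "length priming" variant that spells out additional length information. The exact prompt wording, field names, and the priming details are assumptions for illustration, not the paper's actual templates.

```python
# Hypothetical sketch: building a length-constrained compression instruction.
# The wording and the extra "length priming" fields are illustrative assumptions,
# not the templates used in the paper.

def build_instruction(sentence: str, target_len: int, priming: bool = False) -> str:
    """Compose a zero-shot instruction asking an LLM to compress `sentence`
    to exactly `target_len` words, optionally adding explicit length
    information (the idea behind "length priming")."""
    source_len = len(sentence.split())
    lines = [
        f"Compress the following sentence by extracting exactly {target_len} words.",
    ]
    if priming:
        # Length priming: state the source length and how many words to delete,
        # so the model sees the length constraint from several angles.
        lines.append(
            f"The sentence contains {source_len} words; "
            f"delete {source_len - target_len} words and keep {target_len} words."
        )
    lines.append(f"Sentence: {sentence}")
    lines.append("Compressed sentence:")
    return "\n".join(lines)


if __name__ == "__main__":
    example = "The quick brown fox jumps over the lazy dog near the river bank"
    print(build_instruction(example, target_len=7, priming=True))
```

Under these assumptions, the same template can be reused both for zero-shot prompting and for constructing instruction-format training data for fine-tuning, since only the prompt text changes and the underlying model is left unmodified.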

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2406.11097
Document Type :
Working Paper