
STAR: Boosting Low-Resource Information Extraction by Structure-to-Text Data Generation with Large Language Models

Authors:
Ma, Mingyu Derek
Wang, Xiaoxuan
Kung, Po-Nien
Brantingham, P. Jeffrey
Peng, Nanyun
Wang, Wei
Publication Year:
2023

Abstract

Information extraction (IE) tasks such as event extraction require an in-depth understanding of the output structure and sub-task dependencies. They rely heavily on task-specific training data in the form of (passage, target structure) pairs to obtain reasonable performance. However, obtaining such data through human annotation is costly, creating a pressing need for low-resource IE approaches that require minimal human labeling for real-world applications. Fine-tuning supervised models with synthesized training data would be a generalizable method, but existing data generation methods either still rely on large-scale ground-truth data or cannot be applied to complicated IE tasks due to their poor performance. To address these challenges, we propose STAR, a data generation method that leverages Large Language Models (LLMs) to synthesize data instances given limited seed demonstrations, thereby boosting low-resource IE performance. Our approach first generates target structures (Y) and then generates passages (X), all with the aid of LLMs. We design fine-grained, step-by-step instructions to obtain the initial data instances, and we further reduce errors and improve data quality through self-reflection error identification and self-refinement with iterative revision. Our experiments show that the data generated by STAR significantly improves the performance of low-resource event extraction and relation extraction tasks, even surpassing the effectiveness of human-curated data. Human assessment of data quality shows that STAR-generated data exhibits higher passage quality and aligns better with the task definitions than human-curated data.

Comment: To appear at AAAI'24. More info is at https://derek.ma/STAR
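The abstract describes a structure-first pipeline: generate a target structure Y, generate a passage X that expresses it, then apply self-reflection and iterative revision. The following is a minimal, hypothetical sketch of that loop only; `call_llm`, `generate_instance`, and all prompt wordings are illustrative assumptions (the paper's actual prompts and API are not given in this record), with a stub standing in for a real LLM call.

```python
# Hypothetical sketch of a STAR-style structure-to-text generation loop.
# `call_llm` is a stand-in for any real LLM API client; it returns canned
# text here so the sketch is self-contained and runnable.

def call_llm(prompt: str) -> str:
    # Placeholder for an actual LLM request; a real implementation would
    # send `prompt` to a model and return its completion.
    return f"[generated from: {prompt[:40]}...]"

def generate_instance(task_definition: str, seed_demos: list, max_revisions: int = 2):
    # Step 1: synthesize a target structure Y from the task definition
    # and a few seed demonstrations.
    y = call_llm(
        f"Task: {task_definition}\nExamples: {seed_demos}\n"
        "Generate a new target structure:"
    )
    # Step 2: generate a passage X that expresses the structure Y.
    x = call_llm(f"Write a passage that contains this structure: {y}")
    # Steps 3-4: self-reflection to identify errors, then iterative
    # self-refinement of the passage.
    for _ in range(max_revisions):
        critique = call_llm(
            f"List mismatches between passage {x!r} and structure {y!r}:"
        )
        x = call_llm(f"Revise the passage {x!r} to fix: {critique}")
    return x, y  # one synthesized (passage, target structure) pair
```

The structure-before-passage ordering matters for IE data synthesis: generating Y first guarantees the label is well-formed, and X only has to realize it in text, which is easier to verify and revise than extracting a structure from a free-form passage after the fact.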

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2305.15090
Document Type: Working Paper