
Towards Generalized Open Information Extraction

Authors :
Yu, Bowen
Zhang, Zhenyu
Li, Jingyang
Yu, Haiyang
Liu, Tingwen
Sun, Jian
Li, Yongbin
Wang, Bin
Publication Year :
2022

Abstract

Open Information Extraction (OpenIE) facilitates the open-domain discovery of textual facts. However, prevailing solutions evaluate OpenIE models on in-domain test sets held out from the training corpus, which violates the task's founding principle of domain independence. In this paper, we propose to advance OpenIE towards a more realistic scenario: generalizing to unseen target domains whose data distributions differ from the source training domains, termed Generalized OpenIE. To this end, we first introduce GLOBE, a large-scale human-annotated multi-domain OpenIE benchmark, to examine the robustness of recent OpenIE models to domain shift; a relative performance degradation of up to 70% highlights the difficulty of Generalized OpenIE. We then propose DragonIE, which explores a minimalist graph expression of textual facts, the directed acyclic graph, to improve OpenIE generalization. Extensive experiments demonstrate that DragonIE beats previous methods in both in-domain and out-of-domain settings by as much as 6.0% absolute F1 score, but ample room for improvement remains.

Comment: EMNLP 2022
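As a minimal sketch of the core idea the abstract describes (and not DragonIE's actual implementation; the edge choices and helper names below are illustrative assumptions), a textual fact can be encoded as a directed acyclic graph over the sentence's token positions, with a DFS-based check that the fact graph really is acyclic:

```python
# Illustrative sketch only: encode an extracted fact as a directed acyclic
# graph (DAG) over token indices, and verify acyclicity with a three-color DFS.
# The specific edges and function names are assumptions, not DragonIE's API.

from collections import defaultdict


def build_fact_dag(edges):
    """Adjacency map from directed (head, tail) token-index pairs."""
    graph = defaultdict(list)
    for head, tail in edges:
        graph[head].append(tail)
    return graph


def is_acyclic(graph):
    """Return True iff the directed graph contains no cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on current path / finished
    color = defaultdict(int)

    def dfs(node):
        color[node] = GRAY
        for nxt in graph[node]:
            # A GRAY successor is a back edge, i.e. a cycle.
            if color[nxt] == GRAY or (color[nxt] == WHITE and not dfs(nxt)):
                return False
        color[node] = BLACK
        return True

    return all(dfs(n) for n in list(graph) if color[n] == WHITE)


# Tokens: 0:"Obama" 1:"was" 2:"born" 3:"in" 4:"Hawaii"
# Fact (Obama, born in, Hawaii) expressed as token-level edges (illustrative):
fact = build_fact_dag([(0, 2), (2, 3), (3, 4)])
assert is_acyclic(fact)
assert not is_acyclic(build_fact_dag([(0, 1), (1, 0)]))  # cycles are rejected
```

A token-level graph like this is domain-agnostic by construction, which is one plausible reason a DAG expression could help generalization across domains.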

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2211.15987
Document Type :
Working Paper