
E2EET: From Pipeline to End-to-end Entity Typing via Transformer-Based Embeddings

Authors:
Stewart, Michael
Liu, Wei
Publication Year:
2020

Abstract

Entity Typing (ET) is the process of identifying the semantic types of every entity within a corpus. In contrast to Named Entity Recognition, where each token in a sentence is labelled with zero or one class label, ET involves labelling each entity mention with one or more class labels. Existing entity typing models, which operate at the mention level, are limited by two key factors: they do not make use of recently proposed context-dependent embeddings, and they are trained on fixed context windows. They are therefore sensitive to window size selection and unable to incorporate the context of the entire document. In light of these drawbacks, we propose a mention-level model that incorporates context via transformer-based embeddings, and an end-to-end model that uses a Bi-GRU to remove the dependency on window size. An extensive ablative study demonstrates the effectiveness of contextualised embeddings for mention-level models and the competitiveness of our end-to-end model for entity typing.
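
As a rough illustration of the end-to-end variant described in the abstract, the PyTorch sketch below feeds transformer embeddings into a document-level Bi-GRU and scores every token against every type with a sigmoid, so a mention can receive one or more labels without a fixed context window. The encoder name, hidden sizes, type count, and example sentence are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class EndToEndEntityTyper(nn.Module):
    # Transformer embeddings -> document-level Bi-GRU -> per-token multi-label type scores.
    def __init__(self, num_types, encoder_name="bert-base-uncased", gru_hidden=256):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        self.bigru = nn.GRU(self.encoder.config.hidden_size, gru_hidden,
                            batch_first=True, bidirectional=True)  # spans the whole sequence, no window
        self.classifier = nn.Linear(2 * gru_hidden, num_types)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        gru_out, _ = self.bigru(hidden)
        return self.classifier(gru_out)  # train with BCEWithLogitsLoss for multi-label typing

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = EndToEndEntityTyper(num_types=50)          # hypothetical type inventory size
batch = tokenizer(["Alan Turing was born in London ."], return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])   # shape: (1, seq_len, num_types)
probs = torch.sigmoid(logits)                      # threshold to assign one or more types per token

Because the classifier applies an independent sigmoid per type rather than a softmax, a single mention can be assigned several labels, which is the key difference from token-level NER noted in the abstract.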

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2003.10097
Document Type:
Working Paper