Integrating Graph Contextualized Knowledge into Pre-trained Language Models
- Publication Year :
- 2019
Abstract
- Complex node interactions are common in knowledge graphs (KGs), and these interactions also carry rich knowledge. However, traditional methods usually treat a triple as the training unit during knowledge representation learning (KRL), neglecting the contextualized information of nodes in KGs. We generalize the modeling object to a very general form, which theoretically supports any subgraph extracted from the knowledge graph, and these subgraphs are fed into a novel transformer-based model to learn the knowledge embeddings. To broaden the usage scenarios of this knowledge, pre-trained language models are utilized to build a model that incorporates the learned knowledge representations. Experimental results demonstrate that our model achieves state-of-the-art performance on several medical NLP tasks, and the improvement over TransE indicates that our KRL method captures graph contextualized information effectively.
- Comment : Findings of EMNLP 2020
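The abstract describes feeding extracted KG subgraphs into a transformer-based model so that node embeddings reflect their graph context. The following is a minimal sketch of that idea in PyTorch, assuming a simple linearization of each subgraph's triples into a token sequence; the class name SubgraphEncoder, the shared entity/relation embedding table with an id offset, and all dimensions are illustrative assumptions, not the authors' published architecture.

```python
# Hypothetical sketch: encode a KG subgraph with a Transformer by linearizing
# its (head, relation, tail) triples into one token sequence, so self-attention
# lets every node see its graph context. Not the paper's exact model.
import torch
import torch.nn as nn

class SubgraphEncoder(nn.Module):
    def __init__(self, num_entities, num_relations, dim=128, heads=4, layers=2):
        super().__init__()
        # One shared embedding table; relation ids are offset by num_entities.
        self.embed = nn.Embedding(num_entities + num_relations, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, triples):
        # triples: (batch, n_triples, 3) LongTensor of (head, relation, tail)
        # ids, with relation ids already offset past the entity range.
        b, n, _ = triples.shape
        tokens = triples.reshape(b, n * 3)         # h r t h r t ...
        ctx = self.encoder(self.embed(tokens))     # contextualized embeddings
        return ctx.reshape(b, n, 3, -1)            # per-triple, per-position

# Toy usage: two subgraphs of four triples over 1000 entities, 50 relations.
enc = SubgraphEncoder(num_entities=1000, num_relations=50)
h_t = torch.randint(0, 1000, (2, 4, 2))            # head and tail entity ids
r = torch.randint(1000, 1050, (2, 4, 1))           # offset relation ids
triples = torch.cat([h_t[:, :, :1], r, h_t[:, :, 1:]], dim=2)
out = enc(triples)                                 # shape (2, 4, 3, 128)
```

For contrast, the TransE baseline named in the abstract scores each triple independently, typically as f(h, r, t) = -||h + r - t||, so it cannot share information across neighboring triples; letting attention over a whole subgraph do that sharing is the kind of contextualization the abstract claims improves on TransE.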
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.1912.00147
- Document Type :
- Working Paper