
Knowledge Graph Completion for the Chinese Text of Cultural Relics Based on Bidirectional Encoder Representations from Transformers with Entity-Type Information.

Authors :
Zhang, Min
Geng, Guohua
Zeng, Sheng
Jia, Huaping
Source :
Entropy. Oct 2020, Vol. 22, Issue 10, p. 1168.
Publication Year :
2020

Abstract

Knowledge graph completion makes knowledge graphs more complete and is therefore a meaningful research topic. However, existing methods do not make full use of entity semantic information. A further challenge is that deep models require large-scale manually labelled data, which greatly increases manual labour. To alleviate the scarcity of labelled data in the field of cultural relics and to capture the rich semantic information of entities, this paper proposes a model based on Bidirectional Encoder Representations from Transformers (BERT) with entity-type information for knowledge graph completion over Chinese texts of cultural relics. In this work, the knowledge graph completion task is treated as a classification task: the entities, relations and entity-type information are integrated into a single textual sequence, Chinese characters serve as the token unit, and the input representation is constructed by summing the token, segment and position embeddings. A large amount of unlabelled data is used to pre-train the model, and then a small amount of labelled data is used to fine-tune the pre-trained model. The experimental results show that the BERT-KGC model with entity-type information enriches the semantic information of the entities, reduces the ambiguity of entities and relations to some degree, and outperforms the baselines on triple classification, link prediction and relation prediction tasks using 35% of the labelled cultural-relics data. [ABSTRACT FROM AUTHOR]
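The abstract describes serialising a triple plus entity-type information into one character-level sequence whose input representation sums token, segment and position embeddings. The sketch below illustrates that construction under stated assumptions: the segment layout, the toy vocabulary, the example triple and all function names are illustrative, not the authors' actual implementation.

```python
# Hedged sketch: a character-level, BERT-style input for triple
# classification, concatenating head entity + its type, the relation,
# and tail entity + its type, then summing token, segment and position
# embeddings. The embedding tables are random toy tables, not trained.
import numpy as np

def build_sequence(head, head_type, relation, tail, tail_type):
    """Serialise a triple plus entity types into one character sequence."""
    seg_a = list(head + head_type)   # head entity + its type, char tokens
    seg_b = list(relation)
    seg_c = list(tail + tail_type)   # tail entity + its type, char tokens
    tokens = (["[CLS]"] + seg_a + ["[SEP]"] + seg_b + ["[SEP]"]
              + seg_c + ["[SEP]"])
    # Segment ids distinguish the parts of the sequence (assumed layout).
    segments = ([0] * (len(seg_a) + 2) + [1] * (len(seg_b) + 1)
                + [0] * (len(seg_c) + 1))
    return tokens, segments

def input_embeddings(tokens, segments, vocab, dim=8, seed=0):
    """Sum token, segment and position embeddings, as the abstract states."""
    rng = np.random.default_rng(seed)
    tok_table = rng.normal(size=(len(vocab), dim))
    seg_table = rng.normal(size=(2, dim))
    pos_table = rng.normal(size=(512, dim))
    ids = [vocab[t] for t in tokens]
    return (tok_table[ids] + seg_table[segments]
            + pos_table[np.arange(len(ids))])

# Illustrative cultural-relics triple (hypothetical, not from the paper):
# (Terracotta Army, excavated-at, Xi'an), with entity types appended.
tokens, segments = build_sequence("兵马俑", "文物", "出土于", "西安", "地名")
vocab = {t: i for i, t in enumerate(dict.fromkeys(tokens))}
emb = input_embeddings(tokens, segments, vocab)
print(len(tokens), emb.shape)  # 16 tokens, one dim-8 vector per token
```

In this reading, the final `[CLS]` representation would feed a binary classifier that scores the triple as plausible or not, which is how the abstract frames completion as classification.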

Details

Language :
English
ISSN :
1099-4300
Volume :
22
Issue :
10
Database :
Academic Search Index
Journal :
Entropy
Publication Type :
Academic Journal
Accession number :
146659633
Full Text :
https://doi.org/10.3390/e22101168