
A Combined Semantic Dependency and Lexical Embedding RoBERTa Model for Grid Field Relational Extraction

Authors :
Qi Meng
Xixiang Zhang
Yun Dong
Yan Chen
Dezhao Lin
Source :
Applied Sciences, Vol 13, Iss 19, p 11074 (2023)
Publication Year :
2023
Publisher :
MDPI AG, 2023.

Abstract

Relation extraction is a crucial step in the construction of a knowledge graph. In this research, entity relation extraction for the grid field was performed with a labeling approach based on span representations. The subject entity and object entity were used as training instances to strengthen the linkage between them. The embedding layer of the RoBERTa pre-trained model combined word embedding, position embedding, and paragraph embedding information. In addition, semantic dependency was introduced to establish effective links between different entities, and an additional lexical label embedding was added so that the model could capture deeper semantic information. The resulting embeddings were fed to the RoBERTa model for multi-task learning of entities and relations, and the multi-task information was fused through a hard parameter sharing mechanism. Finally, a fully connected layer produced the predicted entity relations. The approach was evaluated on a grid field dataset created for this study, and the results demonstrated that the proposed model achieves high performance.
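
The sketch below is not the authors' released code; it is a minimal illustration, under stated assumptions, of the shared-encoder, multi-task setup the abstract describes: a RoBERTa backbone whose token representations are augmented with extra lexical and semantic-dependency label embeddings, feeding two heads (entity labeling and relation classification) that hard-share all encoder parameters. The embedding sizes, label counts, fusion by simple addition, and the span representation via start tokens are assumptions for illustration only.

```python
import torch
import torch.nn as nn
from transformers import RobertaConfig, RobertaModel


class JointEntityRelationModel(nn.Module):
    """Hypothetical joint entity/relation model with hard parameter sharing."""

    def __init__(self, num_entity_labels=5, num_relation_labels=8,
                 num_pos_tags=32, num_dep_labels=48):
        super().__init__()
        config = RobertaConfig()                 # randomly initialised for this sketch
        self.encoder = RobertaModel(config)      # single shared (hard-shared) backbone
        hidden = config.hidden_size
        # Extra embeddings for lexical (POS) tags and semantic-dependency labels;
        # fusing them by addition is an assumption made for this illustration.
        self.pos_embed = nn.Embedding(num_pos_tags, hidden)
        self.dep_embed = nn.Embedding(num_dep_labels, hidden)
        self.entity_head = nn.Linear(hidden, num_entity_labels)          # per-token entity labels
        self.relation_head = nn.Linear(2 * hidden, num_relation_labels)  # subject/object pair relations

    def forward(self, input_ids, attention_mask, pos_ids, dep_ids,
                subj_index, obj_index):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Fuse contextual, lexical, and dependency information token by token.
        h = out.last_hidden_state + self.pos_embed(pos_ids) + self.dep_embed(dep_ids)
        entity_logits = self.entity_head(h)       # (batch, seq_len, num_entity_labels)
        # Represent subject/object spans by the hidden state at their start tokens,
        # a simplification of the span representation used in the paper.
        batch = torch.arange(h.size(0))
        pair = torch.cat([h[batch, subj_index], h[batch, obj_index]], dim=-1)
        relation_logits = self.relation_head(pair)  # (batch, num_relation_labels)
        return entity_logits, relation_logits
```

In such a setup, a multi-task objective would typically sum a token-level cross-entropy loss over entity_logits and a pair-level cross-entropy loss over relation_logits, back-propagating both through the single shared encoder so that the entity and relation tasks regularise each other.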

Details

Language :
English
ISSN :
2076-3417
Volume :
13
Issue :
19
Database :
Directory of Open Access Journals
Journal :
Applied Sciences
Publication Type :
Academic Journal
Accession number :
edsdoj.570f362b86b746629cb9de48d5f6b97e
Document Type :
article
Full Text :
https://doi.org/10.3390/app131911074