
Improving taxonomic relation learning via incorporating relation descriptions into word embeddings.

Authors :
Huang, Subin
Luo, Xiangfeng
Huang, Jing
Wang, Hao
Gu, Shengwei
Guo, Yike
Source :
Concurrency & Computation: Practice & Experience; 7/25/2020, Vol. 32 Issue 14, p1-16, 16p
Publication Year :
2020

Abstract

Summary: Taxonomic relations play an important role in various Natural Language Processing (NLP) tasks (e.g., information extraction, question answering, and knowledge inference). Existing embedding‐based approaches to taxonomic relation learning mainly rely on word embeddings trained using co‐occurrence‐based similarity learning. However, the performance of these approaches is not satisfactory due to the lack of sufficient taxonomic semantic knowledge within the word embeddings. To solve this problem, we propose an improved embedding‐based approach that learns taxonomic relations by incorporating relation descriptions into word embeddings. First, to capture additional taxonomic semantic knowledge, we train special word embeddings using not only co‐occurrence information of words but also relation descriptions (e.g., taxonomic seed relations and their contextual triples). Then, using the trained word embeddings as features, we employ two learning models to identify and predict taxonomic relations: an offset‐based classification model and an offset‐based similarity model. Experimental results on four real‐world domain datasets demonstrate that our proposed approach captures additional taxonomic semantic knowledge, reduces dependence on the training dataset, and outperforms the compared state‐of‐the‐art approaches on the taxonomic relation learning task. [ABSTRACT FROM AUTHOR]
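The abstract names two offset-based models built on word-embedding features. The following is a minimal sketch of that general idea, not the authors' implementation: it assumes placeholder random embeddings and illustrative vocabulary and pair labels, with the offset (vector difference) of a candidate (hyponym, hypernym) pair used as the feature for a classifier and for a similarity score against averaged seed-relation offsets.

```python
# Hedged sketch (not the paper's code): offset-based taxonomic relation learning
# with word embeddings as features. Embeddings here are random stand-ins; in the
# paper they would be the specially trained vectors described in the abstract.
import numpy as np
from numpy.linalg import norm
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
DIM = 50
vocab = ["animal", "dog", "cat", "vehicle", "car", "banana", "fruit"]
emb = {w: rng.normal(size=DIM) for w in vocab}  # placeholder embeddings

def offset(hypo, hyper):
    """Offset feature vector for a candidate (hyponym, hypernym) pair."""
    return emb[hypo] - emb[hyper]

# --- Offset-based classification model (sketch) ---
# Positive pairs are taxonomic ("dog" is-a "animal"); negatives are not.
pairs = [("dog", "animal", 1), ("cat", "animal", 1), ("car", "vehicle", 1),
         ("banana", "vehicle", 0), ("animal", "car", 0), ("fruit", "dog", 0)]
X = np.array([offset(a, b) for a, b, _ in pairs])
y = np.array([label for _, _, label in pairs])
clf = LogisticRegression().fit(X, y)
print(clf.predict([offset("banana", "fruit")]))  # predict an unseen pair

# --- Offset-based similarity model (sketch) ---
# Average the offsets of seed taxonomic relations, then score a candidate
# pair by cosine similarity of its offset to that prototype.
seed_proto = np.mean([offset(a, b) for a, b, lbl in pairs if lbl == 1], axis=0)

def taxonomic_score(hypo, hyper):
    v = offset(hypo, hyper)
    return float(v @ seed_proto / (norm(v) * norm(seed_proto)))

print(taxonomic_score("banana", "fruit"))
```

With real embeddings enriched by relation descriptions, the offsets of true taxonomic pairs would cluster more tightly, which is what both models rely on; with the random placeholders above, the outputs are only illustrative of the pipeline shape.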

Details

Language :
English
ISSN :
1532-0626
Volume :
32
Issue :
14
Database :
Complementary Index
Journal :
Concurrency & Computation: Practice & Experience
Publication Type :
Academic Journal
Accession number :
144200848
Full Text :
https://doi.org/10.1002/cpe.5696