
Graph-Based Cross-Granularity Message Passing on Knowledge-Intensive Text

Authors :
Yan, Chenwei
Fu, Xiangling
You, Xinxin
Wu, Ji
Liu, Xien
Source :
IEEE/ACM Transactions on Audio, Speech, and Language Processing; 2024, Vol. 32, Issue 1, pp. 4409-4419, 11 p.
Publication Year :
2024

Abstract

In knowledge-intensive fields such as medicine, text often contains numerous professional terms, specific text fragments, and multidimensional information. However, most existing text representation methods ignore this specialized knowledge and instead adopt methods similar to those used in the general domain. In this paper, we focus on developing a learning module that enhances the representation of knowledge-intensive text through a graph-based cross-granularity message passing mechanism. To this end, we propose a novel learning framework, the Multi-Granularity Graph Neural Network (MG-GNN), which integrates fine-grained and coarse-grained knowledge at the character, word, and phrase levels. MG-GNN performs learning in two stages: 1) intra-granularity learning and 2) inter-granularity learning. During intra-granularity learning, semantic knowledge is extracted within each of the character, word, and phrase granularity graphs, whereas inter-granularity learning fuses knowledge across the different granularity graphs to achieve comprehensive message integration. To improve this fusion, we propose a context-based gating mechanism that guides cross-graph propagation learning. Furthermore, we apply MG-GNN to two important medical applications. Experimental results demonstrate that the proposed MG-GNN model significantly improves performance on both diagnosis prediction and medical named entity recognition tasks.
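The abstract describes a two-stage pattern: per-graph message passing within each granularity, followed by context-gated fusion of messages across granularities. The sketch below, written in PyTorch, illustrates that general pattern only; it is not the authors' implementation. The module names, the mean-aggregation update, the sigmoid gate, and the toy character-to-word alignment matrix are all illustrative assumptions.

```python
# Hypothetical sketch of cross-granularity gated fusion (not the authors' code).
# Each granularity (character, word, phrase) has its own node features and
# adjacency; intra-granularity message passing runs per graph, then a
# context-based gate weighs the message arriving from another granularity.
import torch
import torch.nn as nn


class IntraGranularityLayer(nn.Module):
    """One round of mean-aggregation message passing inside a single graph."""

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: (N, N) row-normalized adjacency; x: (N, dim) node features.
        return torch.relu(self.linear(adj @ x))


class ContextGatedFusion(nn.Module):
    """Gate a cross-granularity message by the receiving node's own context."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, target: torch.Tensor, message: torch.Tensor) -> torch.Tensor:
        # target: (N, dim) nodes of the receiving graph;
        # message: (N, dim) representation aligned from another granularity graph.
        g = torch.sigmoid(self.gate(torch.cat([target, message], dim=-1)))
        return g * message + (1 - g) * target


if __name__ == "__main__":
    dim, n_char, n_word = 64, 10, 4
    char_x, word_x = torch.randn(n_char, dim), torch.randn(n_word, dim)
    char_adj = torch.softmax(torch.randn(n_char, n_char), dim=-1)  # toy normalized adjacency
    word_adj = torch.softmax(torch.randn(n_word, n_word), dim=-1)
    # Toy char->word alignment (which characters compose each word), row-normalized.
    align = torch.softmax(torch.randn(n_word, n_char), dim=-1)

    intra = IntraGranularityLayer(dim)
    fuse = ContextGatedFusion(dim)

    char_h = intra(char_x, char_adj)       # intra-granularity learning (character graph)
    word_h = intra(word_x, word_adj)       # intra-granularity learning (word graph)
    word_h = fuse(word_h, align @ char_h)  # inter-granularity: gated character-to-word message
    print(word_h.shape)                    # torch.Size([4, 64])
```

The gate here conditions only on the receiving node's own state, which is one simple way to realize "context-based" gating; the paper's actual gating mechanism and graph construction may differ.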

Details

Language :
English
ISSN :
2329-9290
Volume :
32
Issue :
1
Database :
Supplemental Index
Journal :
IEEE/ACM Transactions on Audio, Speech, and Language Processing
Publication Type :
Periodical
Accession number :
ejs67665497
Full Text :
https://doi.org/10.1109/TASLP.2024.3473308