MKDAT: Multi-Level Knowledge Distillation with Adaptive Temperature for Distantly Supervised Relation Extraction.
- Source:
- Information (2078-2489). Jul 2024, Vol. 15, Issue 7, p382. 18p.
- Publication Year:
- 2024
Abstract
- Distantly supervised relation extraction (DSRE) automatically annotates data with triplet facts to overcome the limitations of manual annotation, but this automatic labeling is prone to mislabeled (noisy) annotations. To address this noise, we leveraged a novel knowledge distillation (KD) method that differs from conventional DSRE models. Specifically, we propose a model-agnostic KD method, Multi-Level Knowledge Distillation with Adaptive Temperature (MKDAT), which comprises two modules: Adaptive Temperature Regulation (ATR) and Multi-Level Knowledge Distilling (MKD). ATR assigns each training instance an entropy-based distillation temperature, providing moderately softened supervision to the student; for instances with high entropy, labels may instead be hardened. MKD combines the teacher's bag-level and instance-level knowledge as supervision for the student, training the teacher at the bag level and the student at the instance level, which mitigates the effect of noisy annotations and improves sentence-level prediction performance. In addition, we implemented three MKDAT models based on the CNN, PCNN, and ATT-BiLSTM neural networks, and the experimental results show that our distillation models outperform the baseline models on both bag-level and instance-level evaluations.
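- The abstract does not give the ATR formula, so the PyTorch sketch below is only an illustration of an entropy-based adaptive temperature: the function names (`adaptive_temperature`, `kd_loss`), the bounds `t_min`/`t_max`, and the linear entropy-to-temperature mapping are assumptions for this sketch, not the paper's definitions.

```python
import torch
import torch.nn.functional as F

def adaptive_temperature(teacher_logits: torch.Tensor,
                         t_min: float = 0.5,
                         t_max: float = 4.0) -> torch.Tensor:
    """Map each instance's teacher-output entropy to a distillation
    temperature. The linear mapping here is hypothetical; the paper's
    exact ATR scheme may differ."""
    probs = F.softmax(teacher_logits, dim=-1)
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    max_entropy = torch.log(torch.tensor(float(teacher_logits.size(-1))))
    norm_entropy = entropy / max_entropy  # scaled to [0, 1]
    # High-entropy (uncertain, possibly noisy) instances get a low
    # temperature; values below 1 sharpen ("harden") the soft labels,
    # while low-entropy instances receive softer supervision.
    return t_max - (t_max - t_min) * norm_entropy

def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor) -> torch.Tensor:
    """Per-instance softened cross-entropy between teacher and student."""
    T = adaptive_temperature(teacher_logits).unsqueeze(-1)
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    return -(soft_targets * log_student).sum(dim=-1).mean()
```

- In the full MKD setup, this instance-level student loss would presumably be combined with a bag-level objective for the teacher; the abstract does not specify how the two levels are weighted.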
Details
- Language:
- English
- ISSN:
- 2078-2489
- Volume:
- 15
- Issue:
- 7
- Database:
- Academic Search Index
- Journal:
- Information (2078-2489)
- Publication Type:
- Academic Journal
- Accession Number:
- 178701271
- Full Text:
- https://doi.org/10.3390/info15070382