1. A syntactic evidence network model for fact verification.
- Author
- Chen Z, Hui SC, Zhuang F, Liao L, Jia M, Li J, and Huang H
- Subjects
- Humans, Deep Learning, Neural Networks, Computer, Attention physiology, Language, Natural Language Processing, Semantics
- Abstract
In natural language processing, fact verification is a challenging task that requires retrieving multiple evidence sentences from a reliable corpus to verify the authenticity of a claim. Although most current deep learning methods use the attention mechanism for fact verification, they do not impose attentional constraints on the important related words in the claim and evidence sentences, so attention is spread inaccurately over irrelevant words. In this paper, we propose a syntactic evidence network (SENet) model that incorporates entity keywords, syntactic information and sentence attention for fact verification. The SENet model extracts entity keywords from claim and evidence sentences, uses a pre-trained syntactic dependency parser to extract the corresponding syntactic sentence structures, and incorporates the extracted syntactic information into the attention mechanism for language-driven word representation. In addition, a sentence attention mechanism is applied to obtain a richer semantic representation. We have conducted experiments on the FEVER and UKP Snopes datasets for performance evaluation. Our SENet model achieved 78.69% Label Accuracy and 75.63% FEVER Score on the FEVER dataset, and 65.0% precision and 61.2% macro F1 on the UKP Snopes dataset. The experimental results show that the proposed SENet model outperforms the baseline models and achieves state-of-the-art performance for fact verification.
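As a rough illustration of the syntax-constrained attention the abstract describes, the sketch below uses a dependency parser to build a token-level mask that keeps attention only between syntactically related words, then applies that mask inside a scaled dot-product attention step. This is a minimal sketch under stated assumptions, not the paper's implementation: the choice of spaCy and PyTorch, the mask construction, and all function names are illustrative.

```python
# Hypothetical sketch of syntax-constrained attention, in the spirit of SENet.
# Assumes spaCy (with the "en_core_web_sm" model installed) for dependency
# parsing and PyTorch for the attention step; names here are illustrative.
import spacy
import torch
import torch.nn.functional as F

nlp = spacy.load("en_core_web_sm")

def syntactic_mask(sentence: str) -> tuple[list[str], torch.Tensor]:
    """Parse a sentence and build a 0/1 mask that allows attention only
    between syntactically related tokens (self, head, and direct children)."""
    doc = nlp(sentence)
    mask = torch.zeros(len(doc), len(doc))
    for tok in doc:
        mask[tok.i, tok.i] = 1.0          # self
        mask[tok.i, tok.head.i] = 1.0     # dependency head
        mask[tok.head.i, tok.i] = 1.0     # child (symmetric link)
    return [tok.text for tok in doc], mask

def masked_attention(q, k, v, mask):
    """Scaled dot-product attention with syntactically unrelated positions
    suppressed before the softmax."""
    scores = q @ k.transpose(-1, -2) / k.size(-1) ** 0.5
    scores = scores.masked_fill(mask == 0, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

tokens, mask = syntactic_mask("The claim is supported by the retrieved evidence.")
x = torch.randn(len(tokens), 16)          # stand-in token embeddings
out = masked_attention(x, x, x, mask)     # syntax-constrained word representations
```

In this sketch the mask plays the role of the attentional constraint: each token can only attend to itself, its dependency head, and its children, so irrelevant words receive no attention weight. A sentence-level attention layer over the resulting representations would be a separate step.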
- Published
- 2024